U.S. Department of Education
Office of Educational Research and Improvement

NCES 1994–102

NATIONAL CENTER FOR EDUCATION STATISTICS
October 1994

Literacy Behind
Prison Walls

U.S. Department of Education
Richard W. Riley
Secretary
Office of Educational Research and Improvement
Sharon P. Robinson
Assistant Secretary
National Center for Education Statistics
Emerson J. Elliott
Commissioner
National Center for Education Statistics
“The purpose of the Center shall be to collect, analyze, and disseminate
statistics and other data related to education in the United States and in
other nations.” — Section 406(b) of the General Education Provisions Act,
as amended (20 U.S.C. 1221e-1).
OCTOBER 1994
Contact: Andrew Kolstad, 202-502-7374

Ordering Information
For information on the price of single copies or bulk quantities of
this book, call the U.S. Government Printing Office Order Desk
at 202-783-3238.
The GPO stock number for this book is NCES 94-102.
For more information,
write to:
Education Information Branch
Office of Educational Research and Improvement
U.S. Department of Education
555 New Jersey Avenue, N.W.
Washington, D.C. 20208-5641
or call:
1-800-424-1616
(in the Washington, D.C. metropolitan area,
call 202-219-1651), or FAX 202-219-1970.

The work upon which this publication is based was performed for the National Center for Education
Statistics, Office of Educational Research and Improvement, by Educational Testing Service.
Educational Testing Service is an equal opportunity, affirmative action employer.
Educational Testing Service, ETS, and the ETS logo are registered trademarks of Educational Testing Service.

CONTENTS

Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  xi

Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  xiii

Executive Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  xvii
  The Literacy Skills of Inmates . . . . . . . . . . . . . . . . . . . . . . . .  xviii
  Experiences Prior to Prison . . . . . . . . . . . . . . . . . . . . . . . . . .  xx
  Prison Experiences . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  xx
  Recidivism . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  xx
  Literacy Practices and Self-Perception . . . . . . . . . . . . . . . . . .  xxi
  Reflections . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  xxi

Chapter 1: Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  1
  Defining and Measuring Literacy . . . . . . . . . . . . . . . . . . . . . .  2
  Conducting the Survey . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  5
  Reporting the Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  7
  About This Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  12
  A Note on Interpretations . . . . . . . . . . . . . . . . . . . . . . . . . . .  13

Chapter 2: The Prose, Document, and Quantitative
    Literacy Skills of America's Prisoners . . . . . . . . . . . . . . . . .  15
  The Correctional Population at the End of 1991 . . . . . . . . . . .  16
  Characteristics of the Prison and Household Populations . . . .  17
  Results for the Prison Population . . . . . . . . . . . . . . . . . . . . .  17
  Comparison of the Prison and Household Populations . . . . . .  19
  Comparisons of the Prison and Household Populations by
      Education, Race/Ethnicity, Sex, and Age . . . . . . . . . . . . .  21
    Results by Educational Attainment . . . . . . . . . . . . . . . . . . .  21
    Results by Race/Ethnicity . . . . . . . . . . . . . . . . . . . . . . . . . .  24
    Results by Educational Attainment and Race/Ethnicity . . . .  26
    Results by Sex . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  30
    Results by Age . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  31
  Disabilities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  33
  Regression Analyses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  34
  Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  37

Contents . . . . . . iii

Chapter 3: Experiences Before Prison . . . . . . . . . . . . . . . . . . .  39
  Educational Experiences . . . . . . . . . . . . . . . . . . . . . . . . . . . .  39
    Educational Attainment . . . . . . . . . . . . . . . . . . . . . . . . . . .  39
    Reason for Dropping Out of School . . . . . . . . . . . . . . . . . .  41
  Home Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  41
    Level of Parental Education . . . . . . . . . . . . . . . . . . . . . . . .  41
    Language Background . . . . . . . . . . . . . . . . . . . . . . . . . . . .  45
  Occupation and Income . . . . . . . . . . . . . . . . . . . . . . . . . . . .  46
  Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  48

Chapter 4: Experiences Unique to Prison Life . . . . . . . . . . . . .  49
  Literacy Proficiency by Type of Offense . . . . . . . . . . . . . . . .  50
  Literacy and Length of Prison Sentence . . . . . . . . . . . . . . . .  51
  Participation in Educational and Vocational Programs . . . . . .  51
  Prison Work Experiences . . . . . . . . . . . . . . . . . . . . . . . . . . .  52
  Joining Groups While in Prison . . . . . . . . . . . . . . . . . . . . . . .  57
  Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  57

Chapter 5: Recidivism and Literacy . . . . . . . . . . . . . . . . . . . . .  59
  Prior Sentences of Prison Inmates . . . . . . . . . . . . . . . . . . . .  60
  Number of Prior Sentences to Probation and/or Incarceration
      and Literacy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  62
  Recidivism and Educational Level of Prison Inmates . . . . . . .  64
  Recidivism and Race/Ethnicity of Prison Inmates . . . . . . . . .  64
  Recidivism and Disabilities of Prison Inmates . . . . . . . . . . . .  67
  Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  69

Chapter 6: Comparing Literacy Practices and Self-Perceptions
    of the Prison and Household Populations . . . . . . . . . . . . . .  71
  Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  71
  Reading Practices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  72
  Writing Practices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  73
  Arithmetic Practices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  75
  Reading Books . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  77
  Self-Perceptions of Ability to Perform Literacy Activities . . . .  81
  Collaboration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  82
  Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  85

Appendix A: Interpreting the Literacy Scales . . . . . . . . . . . . . .  87
  Building the Literacy Tasks . . . . . . . . . . . . . . . . . . . . . . . . . .  87
  Defining the Literacy Levels . . . . . . . . . . . . . . . . . . . . . . . . .  89


  Interpreting the Literacy Levels . . . . . . . . . . . . . . . . . . . . . .  91
    Prose Literacy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  91
    Document Literacy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  102
    Quantitative Literacy . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  111
    Successful Task Performance Across the Literacy Levels . . .  117

Appendix B: Tables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  121

Appendix C: Overview of Procedures Used in the National Adult
    Literacy Survey . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  133
  Sampling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  133
  Weighting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  135
  Background Questionnaires . . . . . . . . . . . . . . . . . . . . . . . . .  136
  Literacy Assessment Booklets . . . . . . . . . . . . . . . . . . . . . . . .  139
  Survey Design: BIB Spiralling . . . . . . . . . . . . . . . . . . . . . . . .  141
  Training the Data Collection Staff . . . . . . . . . . . . . . . . . . . .  141
  Administering the Data Collection Instruments . . . . . . . . . . .  142
  Response Rates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  144
  Data Collection Quality Control . . . . . . . . . . . . . . . . . . . . . .  144
  Scoring the Literacy Exercise Booklets . . . . . . . . . . . . . . . . .  145
  Data Entry . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  146
  Editing and Quality Control . . . . . . . . . . . . . . . . . . . . . . . . .  147
  Scaling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  147
  Statistical Procedures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  149

Appendix D: Definitions of All Subpopulations and
    Variables Reported . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  151
  Prison Population . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  151
  Household Population . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  151
  Highest Level of Education Completed . . . . . . . . . . . . . . . . .  151
  Average Years of Schooling . . . . . . . . . . . . . . . . . . . . . . . . .  152
  Race/Ethnicity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  152
  Sex . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  153
  Age . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  153
  Presence and Type of Physical, Mental, or Other Health Condition . .  153
  Reason for Dropping Out of School . . . . . . . . . . . . . . . . . . .  154
  Level of Parental Education . . . . . . . . . . . . . . . . . . . . . . . . .  154
  Language Spoken in the Home While Growing Up . . . . . . . .  154
  Occupation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  155


  Monthly Income . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  155
  Current Offense . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  155
  Length of Sentence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  155
  Participation in Education and Vocational Programs . . . . . . .  156
  Involvement in Prison Work Assignments . . . . . . . . . . . . . . .  156
  Type of Work Assignment . . . . . . . . . . . . . . . . . . . . . . . . . . .  156
  Participation in Groups . . . . . . . . . . . . . . . . . . . . . . . . . . . .  156
  Type of Groups Joined . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  156
  Number of Groups Joined . . . . . . . . . . . . . . . . . . . . . . . . . .  157
  Recidivism . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  157
  Number of Prior Sentences . . . . . . . . . . . . . . . . . . . . . . . . . .  157
  Reading, Writing, and Arithmetic Practices . . . . . . . . . . . . . .  157
  Types of Books Read . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  158
  Self-Perceptions of Ability to Perform Literacy Activities . . . .  158
  Collaboration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  158

Appendix E: Participants in the Development Process and
    Information About the Authors . . . . . . . . . . . . . . . . . . . . .  159


Figures and Tables

Table 1.1:  The National Adult Literacy Survey Sample . . . . . . . . . . .  8
Figure 1:   Difficulty Values of Selected Tasks Along the Prose,
            Document, and Quantitative Literacy Scales . . . . . . . . . . .  10
Figure 2:   Description of the Prose, Document, and
            Quantitative Literacy Levels . . . . . . . . . . . . . . . . . . . . . .  11
Table 2.1:  Adult Correctional Population, Year-End 1991 . . . . . . . . .  16
Table 2.2:  Percentages of Adults in Prison and Household Populations,
            by Various Demographic Characteristics . . . . . . . . . . . . .  18
Table 2.3:  Percentages at Each Level and Average Proficiencies on
            Each Literacy Scale of Prison and Household Populations . .  19
Table 2.4:  Percentages at Each Level and Average Proficiencies on
            Each Literacy Scale of Prison and Household Populations,
            by Educational Attainment . . . . . . . . . . . . . . . . . . . . . . .  23
Table 2.5:  Percentages at Each Level and Average Proficiencies on
            Each Scale of Prison and Household Populations,
            by Race/Ethnicity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  25
Table 2.6:  Percentages of Prison and Household Populations Attaining
            Each Education Level and Average Years of Schooling,
            by Race/Ethnicity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  27
Table 2.7:  Average Proficiencies on Each Literacy Scale of
            Prison and Household Populations, by Race/Ethnicity and
            by Level of Education . . . . . . . . . . . . . . . . . . . . . . . . . . .  29
Table 2.8:  Percentages at Each Level and Average Proficiencies on
            Each Literacy Scale of Prison and Household Populations,
            by Sex . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  30
Table 2.9:  Percentages at Each Level and Average Proficiencies on
            Each Literacy Scale of Prison and Household Populations,
            by Age . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  32
Table 2.10: Percentages and Average Proficiencies on Each Literacy
            Scale of Prison and Household Populations,
            by Disability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  33
Table 2.11: Percentages and Average Proficiencies on Each Literacy
            Scale of Prison and Household Populations,
            by Various Disabilities . . . . . . . . . . . . . . . . . . . . . . . . . .  35
Table 2.12: Results of Multiple Regression Analyses . . . . . . . . . . . . .  36
Table 3.1:  Percentages and Average Proficiencies on Each Literacy Scale
            of Inmates, by Level of Education and Reason for Dropping
            Out of School . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  40
Table 3.2:  Percentages of Inmates and Householders Reporting Lower,
            Equal, and Higher Levels of Education Than Their Parents . .  42
Table 3.3:  Percentages of Inmates and Householders Reporting Personal
            Education Level and Parental Education Level . . . . . . . . .  43
Table 3.4:  Average Proficiencies on Each Literacy Scale of Prison
            and Household Populations, by Level of Parental Education . .  44


Table 3.5:  Average Proficiencies on Each Literacy Scale of Inmates
            and Householders Reporting Same Level of Education
            as Their Parents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  45
Table 3.6:  Percentages and Average Proficiencies on Each Literacy
            Scale of Inmates, by Language Spoken in the Home . . . . .  46
Table 3.7:  Percentages and Average Proficiencies on Each Literacy Scale
            of Inmates, by Occupation Category and Income Before
            Incarceration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  47
Table 4.1:  Percentages and Average Proficiencies on Each Literacy
            Scale of Inmates, by Current Offense . . . . . . . . . . . . . . .  50
Table 4.2:  Percentages and Average Proficiencies on Each Literacy
            Scale of Inmates, by Length of Sentence . . . . . . . . . . . . .  52
Table 4.3:  Percentages and Average Proficiencies on Each Literacy
            Scale of Inmates, by Participation in Education and/or
            Vocational Programs . . . . . . . . . . . . . . . . . . . . . . . . . . . .  53
Table 4.4:  Percentages and Average Proficiencies on Each Literacy
            Scale of Inmates, by Work Experience in Prison . . . . . . . .  54
Table 4.5:  Percentages of Inmates Reporting Level of
            Education, by Whether Working in Prison . . . . . . . . . . . .  55
Table 4.6:  Percentages and Average Proficiencies on Each Literacy
            Scale of Inmates Reporting Groups Joined in Prison . . . . .  57
Table 5.1:  Percentages and Average Proficiencies on Each Literacy
            Scale of Inmates, by Recidivism . . . . . . . . . . . . . . . . . . .  61
Table 5.2:  Percentages and Average Proficiencies on Each Literacy
            Scale of Inmates, by Number of Times Recidivated . . . . .  63
Table 5.3:  Percentages and Average Proficiencies on Each Literacy
            Scale of Inmates at Each Education Level Reporting
            Recidivism . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  65
Table 5.4:  Percentages and Average Proficiencies on Each Literacy Scale
            of Inmates Reporting Race/Ethnicity, by Recidivism . . . . .  66
Table 5.5:  Percentages and Average Proficiencies on Each Literacy Scale
            of Inmates With or Without Disabilities Reporting
            Recidivism . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  68
Table 6.1:  Percentages of Prison and Household Populations Reporting
            Frequency of Reading Materials in English . . . . . . . . . . .  72
Table 6.2:  Average Proficiencies on Literacy Scales of Prison and
            Household Populations Reporting Frequency of Reading
            Materials in English . . . . . . . . . . . . . . . . . . . . . . . . . . . .  74
Table 6.3:  Percentages of Prison and Household Populations Reporting
            Frequency of Writing Materials in English . . . . . . . . . . . .  75
Table 6.4:  Average Proficiencies on Literacy Scales of Prison and
            Household Populations Reporting Frequency of Writing
            Materials in English . . . . . . . . . . . . . . . . . . . . . . . . . . . .  76
Table 6.5:  Percentages and Average Proficiencies on the Quantitative
            Scale of Prison and Household Populations Reporting
            Frequency of Using Arithmetic . . . . . . . . . . . . . . . . . . . .  77
Table 6.6:  Percentages and Average Prose and Document Proficiencies
            of Prison and Household Populations, by Types of Books
            Read . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  78

Table 6.7:  Percentages within Levels and Average Proficiencies on
            Prose and Document Scales of Prison and Household
            Populations Reporting Reading Books . . . . . . . . . . . . . . .  80
Table 6.8:  Percentages of Prison and Household Populations Reporting
            Self-Perceptions of Ability to Perform Literacy Activities . .  81
Table 6.9:  Average Proficiencies of Prison and Household Populations
            Reporting Self-Perceptions of Ability to Perform Literacy
            Activities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  82
Table 6.10: Percentages of Prison and Household Populations Reporting
            Frequency of Getting Help With Various Tasks . . . . . . . .  83
Table 6.11: Average Proficiencies on Literacy Scales of Prison and
            Household Populations, by Frequency of Help Received for
            Various Tasks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  85
Figure A.1: Probabilities of Successful Performance on Two Prose Tasks,
            by Individuals at Selected Points on the Prose Scale . . . . .  90
Figure A.2: Average Probabilities of Successful Performance,
            by Individuals with Selected Proficiency Scores on the
            Tasks in Each Literacy Level . . . . . . . . . . . . . . . . . . . . . .  120
Table A.1:  Percentages and Average Proficiencies of Adults on
            Each Scale, by Assessment Completion Status . . . . . . . . .  123
Table A.2:  Percentages and Average Proficiencies on
            Each Scale of Adults in Level 1 . . . . . . . . . . . . . . . . . . .  124
Table A.3:  Percentages and Average Proficiencies of Adults
            in Level 1 with at Least One Task Correct,
            by Assessment Completion Status . . . . . . . . . . . . . . . . . .  125
Table A.4:  Percentages and Average Proficiencies of Adults
            in Level 1 with No Tasks Correct,
            by Assessment Completion Status . . . . . . . . . . . . . . . . . .  126
Table A.5P: Percentages of Adults in Selected Groups,
            by Membership in Total U.S. Population, in Level 1,
            and in Level 1 with No Tasks Correct . . . . . . . . . . . . . . .  127
Table A.5D: Percentages of Adults in Selected Groups,
            by Membership in Total U.S. Population, in Level 1,
            and in Level 1 with No Tasks Correct . . . . . . . . . . . . . . .  128
Table A.5Q: Percentages of Adults in Selected Groups,
            by Membership in Total U.S. Population, in Level 1,
            and in Level 1 with No Tasks Correct . . . . . . . . . . . . . . .  129
Table B.1:  Average Years of Schooling of Inmates, by Number of Times
            Recidivated . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  131
Table B.2:  Percentages of Inmates Reporting Number of Times on
            Probation, by Employment Status . . . . . . . . . . . . . . . . . .  131
Table C.1:  Composition of the Item Pool for the
            National Adult Literacy Survey . . . . . . . . . . . . . . . . . . . .  148


ACKNOWLEDGMENTS

We extend our appreciation to the many individuals who contributed to the
project and helped create this report on the results of the survey of the prison
population. Thanks are especially due the members of the Literacy Definition
Committee, the Technical Review Committee, the Literacy of Incarcerated
Adults Review Group, and the Literacy of Older Adults Review Group. The
members of these committees, whose names are listed in appendix E, guided
the project from beginning to end.
The National Adult Literacy Survey was a cooperative effort planned by
the National Center for Education Statistics and the Division of Adult
Education and Literacy of the U.S. Department of Education. Emerson Elliott,
commissioner, provided consistent support and guidance. Andrew Kolstad, the
project monitor, conscientiously guided the project and provided careful
reviews of project deliverables. We also thank Gary Phillips, Sue Ahmed, Joan
Seamon, and Ron Pugsley, who played crucial roles in the project.
Thanks are due our colleagues at Westat, Inc., for their outstanding work
in managing the complex sampling, data collection, and composite weighting
processes for the survey. We especially thank project director Martha Berlin,
senior statistician Joe Waksberg, statisticians Leyla Mohadjer and Jim Green,
field director Sue Rieger, and field managers Rich Hilpert, Merle Klein,
Judy Meader, and Cindy Randall. The hundreds of field supervisors and
interviewers who carried out the survey deserve special thanks for their efforts.
We are grateful to Renee Slobasky, senior vice president of Westat, for her
continuing support.
At Educational Testing Service, we wish to thank Sam Messick for serving
as corporate officer for the survey. Mary Michaels coordinated the committee
meetings, the publication of the assessment framework booklet, and other
aspects of the project, ensuring that the work proceeded smoothly.
Doug Rhodes coordinated the state adult literacy survey project as well as
printing and shipping operations for the national survey, assisted by Cathy
Shaughnessy. Jules Goodison provided senior guidance and support,
particularly in the operations process, and we are grateful to him for his many
contributions. We would also like to express our appreciation to Dave Hobson
for his good sense in financial and other matters.


Our thanks go to all those who carried out the enormous volume of
operations work — in particular Debbie Giannacio, who ably coordinated the
receipt of the survey materials, follow-up activities, and quality control. She
was assisted by Kathy Miller, who also provided administrative support for the
project. We acknowledge the contributions of Joanne Antonoff, who helped
prepare the NALS proposal and whose memory we cherish.
We thank Don Rock and Kentaro Yamamoto of Educational Testing
Service, who directed the statistical and psychometric activities and provided
invaluable guidance on technical matters. Robert Mislevy helped us reflect on
statistical as well as philosophical questions, and Charlie Lewis, Neal Thomas,
and Ming-Mei Wang were available for statistical advice.
Norma Norris deserves special honors for conducting the statistical work
and managing the data analysis under impossibly tight deadlines. Her support
went above and beyond the call of duty, and we are grateful for her dedication
to the project. Additional thanks are extended to Jim Ferris, Dave Freund,
Tom Jirele, Bruce Kaplan, Jennifer Nelson, Inge Novatkoski, Kate Pashley, and
Lois Worthington, who shared responsibility for conducting the data analyses.
Thanks to John Barone for helping to oversee the data analysis activities.
As this report took shape, many reviewers asked questions and provided
insights that helped us sharpen our thinking and writing. In particular, we wish
to thank Paul Barton, Archie Lapointe, and Kentaro Yamamoto for their
suggestions. We appreciate the thoughtful comments we received from the
government reviewers.
Beverly Cisney provided invaluable assistance in word processing and
desktop publishing for this report. Her commitment to the project is
appreciated.
Producing this report was made possible by the excellent work of the ETS
Publications Division. In particular, we thank Robin Matlack, who conceived of
the beautiful design for the report; Peter Stremic, who ensured the report met
the highest standards; Fred Marx, who managed the production stages; Patricia
B. Melvin, who coordinated the production process; and Kiyo Toma, who
provided technical assistance. We thank editor Shilpi Niyogi for her careful
review of the manuscript.
Finally, we wish to thank the thousands of adults across the country who
gave their time to respond to the survey. Without their participation, this study
would not have been possible.
Karl Haigler
Caroline Wolf Harlow
Patricia E. O'Connor
Anne Campbell

PREFACE

The United States has always been a mosaic of cultures, but the diversity of
our population has increased by striking proportions in recent years. As
Barbara Everitt Bryant, former director of the Bureau of the Census, has
written: “If you gave America a face in 1990, it would have shown the first sign
of wrinkles [and] it would have been full of color.”1 The median age of
Americans continues to rise, growing from 30 to almost 33 years during the
1980s. It is projected that by the year 2080, nearly 25 percent of the adults in
this nation will be over 65, compared with only about 12 percent today. The
racial and ethnic composition of the nation also continues to change. While
3.7 million people of Asian or Pacific Islander origin were living in this country
in 1980, there were 7.2 million a decade later — an increase of almost 100 percent.
The number of individuals of Hispanic origin also rose dramatically over this
time period, from roughly 6 to 9 percent of the population, or to more than
22 million people. Our increasing diversity can not only be seen but also be
heard: today, some 32 million individuals in the United States speak a language
other than English, and these languages range from Spanish and Chinese to
Yupik and Mon-Khmer.2
Given these patterns and changes, this is an opportune time to explore the
literacy skills of adults in this nation. In 1988, the U.S. Congress called on the
Department of Education to support a national literacy survey of America's
adults. While recent studies funded by the federal government explored the
literacy of young adults and job seekers, the National Adult Literacy Survey is
the first to provide accurate and detailed information on the skills of the adult
population as a whole: information that, to this point, has been unavailable.
Perhaps never before have so many people from so many different sectors
of society been concerned about adult literacy. Numerous reports published in
1 B.E. Bryant. (1991). “The Changing Face of the United States.” The World Almanac and Book of Facts,
1991. New York, NY: Pharos Books. p. 72.

2 United States Department of Commerce. (April 1993). “Number of Non-English Language Speaking
Americans Up Sharply in 1980s, Census Bureau Says.” United States Department of Commerce News.


the last decade — including A Nation at Risk, The Bottom Line, The Subtle
Danger, Literacy: Profiles of America’s Young Adults, Jump Start: The Federal
Role in Adult Education, Workforce 2000, America’s Choice: High Skills or
Low Wages, and Beyond the School Doors — have provided evidence that a
large portion of our population lacks adequate literacy skills and have
intensified the debate over how this problem should be addressed.
Concerns about literacy are not new. In fact, throughout our nation’s
history there have been periods when the literacy skills of the population were
judged inadequate. Yet, the nature of these concerns has changed radically over
time. In the past, the lack of ability to read and use printed materials was seen
primarily as an individual problem, with implications for a person’s job
opportunities, educational goals, sense of fulfillment, and participation in
society. Now, however, it is increasingly viewed as a national problem, with
implications that reach far beyond the individual. Concerns about the human
costs of limited literacy have, in a sense, been overshadowed by concerns about
the economic and social costs.
Although Americans today are, on the whole, better educated and more
literate than any who preceded them, many employers say they are unable to
find enough workers with the reading, writing, mathematical, and other
competencies required in the workplace. Changing economic, demographic,
and labor-market forces may exacerbate the problem in the future. As a recent
study by the American Society for Training and Development concluded,
“These forces are creating a human capital deficit that threatens U.S.
competitiveness and acts as a barrier to individual opportunities for all
Americans.”3
Whether future jobs will have greater literacy requirements than today’s
jobs, or whether the gap between the nation’s literacy resources and its needs
will widen, are open questions. The evidence to support such predictions is
scarce. What many believe, however, is that our current systems of education
and training are inadequate to ensure individual opportunities, improve
economic productivity, or strengthen our nation’s competitiveness in the global
marketplace.
There is widespread agreement that we as a nation must respond to the
literacy challenge, not only to preserve our economic vitality but also to ensure
that every individual has a full range of opportunities for personal fulfillment
and participation in society. At the historic education summit in Charlottesville,
Virginia, the nation’s governors — including then-Governor Clinton — met
with then-President Bush to establish a set of national education goals that
would guide this country into the twenty-first century. As adopted in 1990 by
members of the National Governors Association, one of the six goals states:
3 A.P. Carnevale, L.J. Gainer, A.S. Meltzer, and S.L. Holland. (October 1988). “Workplace Basics: The
Skills Employers Want.” Training and Development Journal. pp. 20-30.

By the year 2000, every adult American will be
literate and will possess the knowledge and skills
necessary to compete in a global economy and exercise
the rights and responsibilities of citizenship.
The following year, Congress passed the National Literacy Act of 1991,
the purpose of which is “to enhance the literacy and basic skills of adults, to
ensure that all adults in the United States acquire the basic skills necessary to
function effectively and achieve the greatest possible opportunity in their work
and in their lives, and to strengthen and coordinate adult literacy programs.”
But how should these ambitious goals be pursued? In the past, whenever
the population’s skills were called into question, critics generally focused on the
educational system and insisted that school reforms were necessary if the
nation were to escape serious social and economic consequences. Today,
however, many of those who need to improve their literacy skills have already
left school. In fact, it is estimated that almost 80 percent of the work force for
the year 2000 is already employed. Moreover, many of those who demonstrate
limited literacy skills do not perceive that they have a problem. Clearly, then,
the schools alone cannot strengthen the abilities of present and future
employees and of the population as a whole. A broad-based response seems
necessary.
To initiate such a response, we need more than localized reports or
anecdotal information from employers, public leaders, or the press; accurate
and detailed information about our current status is essential. As reading
researchers John Carroll and Jean Chall observed in their book Toward a
Literate Society, “any national program for improving literacy skills would have
to be based on the best possible information as to where the deficits are and
how serious they are.”4 Surprisingly, though, we have lacked accurate and
detailed information about literacy in our nation — including how many
individuals have limited skills, who they are, and the severity of their problems.
In 1988, Congress asked the U.S. Department of Education to address
this need for information on the nature and extent of adult literacy. In
response, the Department’s National Center for Education Statistics and
Division of Adult Education and Literacy called for a national household
survey of the literacy skills of adults in the United States. A contract was
awarded to Educational Testing Service and a subcontract to Westat, Inc. to
design and conduct the National Adult Literacy Survey, results from which are
presented in these pages.

4 J.B. Carroll and J.S. Chall, eds. (1975). Toward a Literate Society: A Report from the National Academy
of Education. New York, NY: McGraw Hill. p. 11.

During the first eight months of 1992, trained staff conducted household
interviews with nearly 13,600 individuals aged 16 and older who had been
randomly selected to represent the adult population in this country. In addition,
some 1,100 inmates from 80 federal and state prisons were interviewed to gather
information on the skills of the prison population. Finally, approximately
1,000 adults were surveyed in each of 12 states that chose to participate in a
special study designed to produce state-level results that are comparable to the
national data. Each individual was asked to spend about an hour responding
to a series of diverse literacy tasks and providing information on his or her
background, education, labor market experiences, and reading practices.
The results of the National Adult Literacy Survey comprise an enormous
set of data that includes more than a million responses to the literacy tasks and
background questions. More important than the size of the database, however,
is the fact that it provides information that was previously unavailable —
information that is essential to understanding this nation’s literacy resources.
To ensure that the survey results will reach a wide audience, the
committees that guided the project recommended that the findings be issued
in a series of reports. This volume discusses the results for the prison
population. The series also includes a report that provides an overview of the
results of the survey as well as additional reports that offer a more detailed look
at particular issues, including:
• literacy in the work force
• literacy and education
• literacy among older adults
• literacy and cultural diversity
• literacy practices
A final report conveys technical information about the survey design and
the methods used to implement it.
Although these reports focus almost exclusively on the results of the
National Adult Literacy Survey, their contents have much broader implications.
The rich collection of information they contain can be used to inform policy
debates, set program objectives, and reflect on our society’s literacy resources
and needs.
Irwin S. Kirsch
Project Director

EXECUTIVE SUMMARY

This is one in a series of reports that look at the results of the National Adult
Literacy Survey, a project funded by the U.S. Department of Education and
administered by Educational Testing Service, in collaboration with Westat, Inc.
This report, in particular, provides an in-depth look at the literacy skills of
prisoners incarcerated in state and federal prisons.
Many past studies of adult literacy have tried to count the number
of “illiterates” in this nation, thereby treating literacy as a condition that
individuals either do or do not have. We believe that such efforts are inherently
arbitrary and misleading. They are also damaging in that they fail to
acknowledge both the complexity of the literacy problem and the range of
solutions needed to address it.
The National Adult Literacy Survey is based on a different concept of
literacy and, therefore, takes a different approach to measuring it. The aim of
this survey is to profile the English literacy of adults in the United States,
including prison inmates, based on their performance across a wide array
of tasks that reflect the types of materials and demands they encounter in their
daily lives.
To gather the information on the literacy skills of inmates, trained staff
interviewed nearly 1,150 inmates in 80 federal and state prisons. The prisons
were randomly selected to represent prisons across the country, and the
inmates themselves were randomly selected from each of the prisons.
In addition, as the main part of the data collection, about 13,600 adults aged
16 and older residing in households were interviewed across the country. These
adults were also randomly selected to represent the adult population of the
nation as a whole. Finally, about an additional 1,000 adults were surveyed in
each of 11 states that chose to participate in a special study designed to provide
state-level results that are comparable to the national data. In total, 26,000
adults participated in the survey.
Each survey participant spent approximately one hour responding to a set
of diverse literacy tasks as well as to questions about his or her demographic
characteristics, educational background, reading practices, and other areas
related to literacy. As a result of their responses to the literacy tasks, adults
received proficiency scores on three scales that reflect varying degrees of skill
in prose, document, and quantitative literacy. The scales make it possible to
profile adults in various subpopulations of interest and to describe their
demonstrated levels of performance.
This report describes the types and levels of literacy skills demonstrated
by prison inmates in this country and compares them with the skills of the
household population of adults. It also explores the relationship between
literacy skills and the background characteristics and prison experiences of
inmates as well as their literacy practices and self-perceptions. Some of the
major findings are highlighted here.

The Literacy Skills of Inmates
• About 7 in 10 prisoners perform in Levels 1 and 2 on the prose, document,
and quantitative scales. These prisoners are apt to experience difficulty in
performing tasks that require them to integrate or synthesize information
from complex or lengthy texts or to perform quantitative tasks that involve
two or more sequential operations and that require the individual to set up
the problem.
• The average proficiencies of the prison population are 246 on the prose
scale, 240 on the document scale, and 236 on the quantitative scale. Their
proficiencies are substantially lower than those of the household population,
whose proficiencies average 273 on the prose scale, 267 on the document
scale, and 271 on the quantitative scale.
• The racial/ethnic composition and educational attainment of the prison
population differ from those of the household population. About 65 percent
of prisoners are minorities versus 24 percent of the household population.
About 51 percent of prisoners have completed at least high school or its
equivalent, compared with 76 percent of the household population. These
differences in demographic composition help to explain the lower average
performance of inmates as compared with householders.
• Educational attainment is highly related to literacy proficiency. Prisoners
who have not received a high school diploma or GED demonstrate lower
levels of proficiency than those who have completed high school, earned a
GED, or received some postsecondary education.

• Inmates who received a GED demonstrate about the same proficiencies as
householders with a GED. In contrast, inmates with a high school diploma
demonstrate lower proficiencies than householders with a high school
diploma.
• On all three literacy scales, White inmates demonstrate higher average
proficiencies than Black inmates, who, in turn, demonstrate higher
proficiencies than Hispanic inmates. The average proficiencies of White
prisoners are lower than those of White householders. Black and Hispanic
prisoners, however, generally demonstrate about the same proficiencies as
their counterparts in the household population.
• When the prison and household populations are compared by educational
attainment and race/ethnicity, prisoners generally perform as well as or
better than their counterparts in the household population. White, Black,
and Hispanic inmates without a high school diploma perform better than
their counterparts in the household population. White and Black prisoners
with a high school diploma or GED demonstrate about the same skills as
their counterparts among householders with the same education. Black
inmates with at least some postsecondary education perform about the same
as their household counterparts, while White inmates with at least some
postsecondary education demonstrate lower proficiencies on the prose and
quantitative scales and comparable proficiency on the document scale.
• Male and female prisoners do not perform differently from each other on
the literacy scales. Both male and female prisoners demonstrate lower
proficiencies on all three scales than their household counterparts.
• Thirty-six percent of prisoners reported having at least one disability,
compared with 26 percent of the household population. Significantly more
inmates than householders reported having a learning disability or a mental
or emotional condition. The proficiencies of inmates with a learning
disability are significantly lower than those of inmates reporting most other
disabilities and are also lower than those of householders reporting a
learning disability.
• When the variables of sex, race/ethnicity, age, and level of education are held
constant, the performance of the prison population on the three scales is
comparable to that of the household population. Thus, differences in overall
performance between the prison and household populations may be
attributed to differences in demographic composition and educational
attainment.
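The adjustment described in the last point can be illustrated with a direct standardization: subgroup averages are reweighted by a single shared demographic distribution before the two populations are compared. The sketch below, in Python, uses the published overall scale averages from this summary; the education subgroup means and the weights are hypothetical, invented only to show the mechanics, and this is not the survey's actual estimation procedure.

```python
# Published overall scale averages from this summary.
prison = {"prose": 246, "document": 240, "quantitative": 236}
household = {"prose": 273, "document": 267, "quantitative": 271}

# Raw gaps before any demographic adjustment.
gaps = {scale: household[scale] - prison[scale] for scale in prison}
print(gaps)  # {'prose': 27, 'document': 27, 'quantitative': 35}

def adjusted_mean(subgroup_means, common_weights):
    """Reweight subgroup means by a shared demographic distribution
    (a direct standardization)."""
    return sum(subgroup_means[g] * common_weights[g] for g in common_weights)

# Hypothetical prose averages by education, one set per population.
prison_by_ed = {"no_diploma": 230, "hs_or_ged": 260, "postsec": 290}
household_by_ed = {"no_diploma": 228, "hs_or_ged": 262, "postsec": 292}
# A single, shared (and equally hypothetical) education distribution.
common = {"no_diploma": 0.4, "hs_or_ged": 0.4, "postsec": 0.2}

print(round(adjusted_mean(prison_by_ed, common), 1))     # 254.0
print(round(adjusted_mean(household_by_ed, common), 1))  # 254.4
```

On these invented inputs the adjusted averages nearly coincide even though the unadjusted averages differ by 27 to 35 points, which is the pattern the survey reports once demographic composition and educational attainment are taken into account.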

Experiences Prior to Prison
• Prisoners, in general, attain lower levels of education than their parents. For
example, 49 percent of prisoners reported not having a high school diploma
or GED, compared with 36 percent of their parents. Furthermore, overall,
39 percent of prisoners attained lower levels of education than their parents,
compared with 21 percent of householders who attained less education than
their parents.
• Generally, the higher the level of parental education, the higher the
prisoners’ proficiencies. When compared with the household population by
level of parental education, however, the prison population demonstrates
lower proficiencies than the household population. This may be attributable,
in part, to the tendency for the inmate population to have lower levels of
education than both their parents and the household population.
• Inmates who come from homes where only a non-English language was
spoken demonstrate significantly lower proficiencies than those who come
from homes where English was spoken. The proficiencies of these inmates
from a non-English language background range from 165 to 180 on the three
scales and indicate that they demonstrate skills associated with only the most
basic literacy tasks.

Prison Experiences
• Over 60 percent of the inmate population reported being involved in
education and/or vocational programs in prison. Generally those
participating only in vocational programs demonstrate higher proficiencies
than those not involved in any programs and those involved in only education
classes or in both education and vocational classes.
• Almost 70 percent of inmates reported working in prison, and 53 percent
53 percent reported joining at least one group. Prisoners who either work
or are involved in groups in prison demonstrate higher proficiencies than
those who do not work or join groups.

Recidivism
• The literacy proficiencies of repeat offenders do not differ from those of
first-time offenders. In addition, the proficiencies with respect to recidivism
do not differ when comparisons are made by levels of education, race/
ethnicity, or by the presence or absence of disabilities.

Literacy Practices and Self-Perception
• The literacy practices that more inmates than householders reported doing
frequently — that is, every day or a few times a week — include reading
and writing letters or memos and reading and writing reports or articles.
Inmates who reported reading any material frequently demonstrate higher
proficiencies than inmates who reported reading less than once a week.
• More inmates than householders reported reading a book within the last six
months — 89 percent compared with 83 percent. The types of books most
frequently read by inmates are fiction, reference, and inspirational and
religious, while the books most frequently read by householders are manuals,
reference, and fiction.
• Inmates do not appear to have as high an opinion of their reading, writing,
and arithmetic skills as do householders. Slightly over half of inmates
reported that they read or write English very well, compared with 71 and
64 percent, respectively, of householders.
Forty percent of inmates, compared with 53 percent of householders, said
they can do arithmetic very well. The proficiencies of inmates who said they
read, write, or do arithmetic very well or well are lower than those of their
counterparts in the household population. There are no appreciable
differences in the proficiencies of inmates and householders who said they
do not read or write well, but proficiencies of inmates who said they do not
do arithmetic well are lower than those of their household counterparts.
• In spite of their overall lower proficiencies, more inmates than householders
reported getting no help with such activities as filling out forms, reading
newspapers or other written information, reading information from agencies
and companies, and writing letters. This may be due more to the prison
environment not being conducive to seeking help than to the inmates
not perceiving that they need help. The proficiencies of inmates who get a
lot of help with these activities are lower than those of inmates who get less
help. Furthermore, inmates who get a lot of help with these activities, with
the exception of writing letters, demonstrate lower proficiencies than do
their household counterparts.

Reflections
In this section, we reflect on the literacy skills of the prison population and
discuss implications for those who administer, work, or volunteer in programs
that directly affect prisoner learning. These include not only prison
administrators, educational and vocational staff, literacy volunteers, and social
group organizers, but also prisoners who help their fellow inmates to advance
their literacy skills.
• Inmates having a high school diploma should not be viewed as necessarily
possessing the literacy skills needed to function in society, given that their
performance is lower than that of householders with a high school diploma.
Inmates who have a high school diploma, as well as inmates with no diploma,
need opportunities to improve their literacy skills.
• One important aspect of being literate in our society is possessing the
knowledge and skills to process information found in documents.1 Reading
and using documents are not only important in our personal lives, but are
also a necessary part of managing a household and performing on the job.
Research has shown that adults spend more time reading documents than
any other type of material.2 Yet from 69 to 81 percent of inmates reported
that they read documents such as directions, instructions, diagrams, bills,
and invoices less than once a week. In addition, more inmates reported
getting help with documents such as forms and printed information from
agencies, companies, and businesses than with other kinds of materials.
Given the importance of documents in our society and inmates’ relative lack
of exposure to them, it would seem that incorporating documents into prison
education and vocational programs, job assignments, and group activities is
important if inmates are to possess the skills needed to succeed once they
are released from prison.
• Even though about the same percentage of inmates as householders
reported getting help doing arithmetic, a greater percentage of inmates
than householders reported that they are not able to do arithmetic well.
This, coupled with the comparatively low performance of inmates on the
quantitative scale, would indicate that the quantitative skills of inmates need
improvement. That inmates do not get a lot of help with arithmetic may be
attributed to the lack of opportunity to use arithmetic in prison life. One
response could be to integrate the use of arithmetic and mathematics into a
variety of prison experiences.

1 I.S. Kirsch and P.B. Mosenthal. (1990). “Exploring Document Literacy: Variables Underlying the
Performance of Young Adults.” Reading Research Quarterly, 25, pp. 5-30.

2 J.T. Guthrie, M. Seifert, and I.S. Kirsch. (1986). “Effects of Education, Occupation, and Setting on
Reading Practices.” American Educational Research Journal, 23, pp. 151-60.

• As reported in Adult Literacy in America: A First Look at the Results of the
National Adult Literacy Survey,3 individuals demonstrating lower levels of
literacy were more likely to be out of the labor force. According to this
report, from 34 to 53 percent of adults in Levels 1 and 2 were out of the
work force at the time of the survey. Given that over two-thirds of the
inmates demonstrate performance in Levels 1 and 2, their prospects for
being employed upon release from prison are diminished, unless their skills
can be improved considerably. This same report also revealed that adults
who perform in the highest levels are much more likely to report holding
managerial, professional, or technical jobs than are respondents who
perform in the lowest levels. Adults in the two lowest levels were more likely
to be employed in craft, service, labor, assembly, farming, or fishing
occupations. It is, therefore, not surprising that few inmates reported
holding professional jobs prior to incarceration.
• Eleven percent of prisoners reported having learning disabilities, compared
with only 3 percent of the general population. These inmates scored at the
very low end of the three literacy scales, and their demonstrated proficiencies
indicate that they are able to perform only the most basic literacy tasks. The
fact that learning disabled people are disproportionately represented in the
prison population underscores the need for accommodating learning
disabilities and developing methods tailored for the learning disabled in
prison learning situations.
The national goal that all of America’s adults be literate by the year 2000
includes those adults incarcerated in prison. Given the results reported, literacy
programs for inmates cannot afford to be shortchanged. Prisons should not be
expected, however, to shoulder all the responsibility; individuals, groups,
organizations, schools, colleges, and businesses need to reach behind prison
walls with efforts aimed at improving the literacy skills of inmates. It will take a
comprehensive strategy, the purpose of which should be to prepare the whole
person for succeeding in the world beyond prison walls.

3 I.S. Kirsch, A. Jungeblut, L. Jenkins, and A. Kolstad. (September 1993). Adult Literacy in America: A
First Look at the Results of the National Adult Literacy Survey. Washington, DC: U.S. Department of
Education.

CHAPTER 1
Overview*

Literacy and education are keys to opportunity in this society, and perhaps
no one realizes this more clearly than prisoners. An inmate in a maximum
security prison reflected on the importance of learning and literacy in this way:
“When I first came [to prison] I had a negative attitude. I didn’t write. I didn’t
want to go to school. I didn’t think it mattered.” His views were changed,
however, by another prisoner who was involved in postsecondary education.
“He tried to show me how education would help me inside, even more than in
the eyes of someone else,” this prisoner said. His life in prison changed once he
began to take classes. “It made me feel good about myself and gave me hope as
to what I could be.” He has since earned a General Educational Development
(GED) certificate and is now taking college courses.
The results of the National Adult Literacy Survey make it possible, for the
first time, to take an in-depth look at the literacy proficiencies of the prison
population and at the relationships between literacy and individuals’
characteristics and experiences. This large-scale survey, conducted in 1992,
grew out of the Adult Education Amendments of 1988, in which the U.S.
Congress called upon the Department of Education to report on the definition
of literacy and on the nature and extent of literacy among adults in the nation.
In response, the department’s National Center for Education Statistics
(NCES) and the Division of Adult Education and Literacy planned a national
household survey of adult literacy. In September 1989, NCES awarded a four-year contract to Educational Testing Service (ETS) to design and administer
the survey as well as to analyze and report the results. A subcontract was given
to Westat, Inc. for sampling and field operations.
As part of the contract, the survey was to include persons incarcerated in
prison in addition to those living in households. The participation of prisoners
in the survey would help to provide better estimates of the literacy levels of the

*Portions of this chapter originally appeared in the first report on the National Adult Literacy Survey, I.S.
Kirsch, A. Jungeblut, L. Jenkins, and A. Kolstad. (September 1993). Adult Literacy in America: A First
Look at the Results of the National Adult Literacy Survey. Washington, DC: U.S. Department of Education.

total population and would make it possible to report on the literacy
proficiencies of this important segment of our society.
The plan for developing and conducting the survey was guided by a panel
of experts from business and industry, labor, government, research, and adult
education. This Literacy Definition Committee worked with ETS staff to
prepare a definition of literacy that would guide the development of the
assessment objectives as well as the selection and construction of assessment
tasks. A second panel, the Technical Review Committee, was formed to help
ensure the soundness of the assessment design, the quality of the data
collected, the integrity of the analyses conducted, and the appropriateness
of the interpretations of the final results.
This introduction summarizes the discussions that led to the adoption of a
definition of literacy for the National Adult Literacy Survey, the framework
used in designing the survey instruments, the populations assessed, the survey
administration, and the methods for reporting the results.

Defining and Measuring Literacy
The National Adult Literacy Survey is the third and largest assessment of adult
literacy funded by the federal government and conducted by ETS. The two
previous efforts included a 1985 household survey of the literacy skills of
21- to 25-year-olds, funded by the U.S. Department of Education, and a
1989-90 survey of the literacy proficiencies of job seekers, funded by the U.S.
Department of Labor.1 The definition of literacy that guided the National Adult
Literacy Survey was rooted in these preceding studies.
Building on earlier work in large-scale literacy assessment, the 1985 young
adult survey attempted to extend the concept of literacy, to take into account
some of the criticisms of previous surveys, and to benefit from advances in
educational assessment methodology. The national panel of experts assembled
to construct a definition of literacy for that survey rejected the types of
arbitrary standards — such as signing one’s name, completing five years of
school, or scoring at a particular grade level on a school-based measure of
reading achievement — that have long been used to make judgments about
adults’ literacy skills. Through a consensus process, this panel drafted the
following definition of literacy, which helped set the framework for the young
adult survey:

1 I.S. Kirsch and A. Jungeblut. (1986). Literacy: Profiles of America’s Young Adults. Princeton, NJ:
Educational Testing Service. I.S. Kirsch, A. Jungeblut, and A. Campbell. (1992). Beyond the School
Doors: The Literacy Needs of Job Seekers Served by the U.S. Department of Labor. Princeton, NJ:
Educational Testing Service.

Using printed and written information to function in
society, to achieve one’s goals, and to develop one’s
knowledge and potential.
Unlike traditional definitions of literacy, which focused on decoding and
comprehension, this definition encompasses a broad range of skills that adults
use in accomplishing the many different types of literacy tasks associated with
work, home, and community contexts. This new perspective is shaping not only
adult literacy assessment, but policy as well. For example, the National Adult
Literacy Act of 1991 defined literacy as “an individual’s ability to read, write,
and speak in English and compute and solve problems at levels of proficiency
necessary to function on the job and in society, to achieve one’s goals, and to
develop one’s knowledge and potential.”
The definition of literacy from the young adult survey was adopted by the
panel that guided the development of the 1989-90 survey of job seekers, and it
also provided the starting point for the discussions of the Literacy Definition
Committee of the National Adult Literacy Survey. This committee agreed that
expressing the literacy proficiencies of adults in school-based terms or grade-level scores is inappropriate. In addition, while the committee recognized the
importance of teamwork skills, interpersonal skills, and communication skills
for functioning in various contexts such as the workplace, it was decided that
these areas are not part of literacy per se and therefore should not be
incorporated into the definition of literacy guiding the survey.
Further, the committee endorsed the notion that literacy is neither a
single skill suited to all types of texts, nor an infinite number of skills, each
associated with a given type of text or material. Rather, as suggested by the
results of the young adult and job seeker surveys, an ordered set of skills
appears to be called into play to accomplish diverse types of tasks. Given this
perspective, the committee agreed to adopt not only the definition of literacy
that was used in the previous surveys, but also the three scales developed as
part of those efforts:
Prose literacy — the knowledge and skills needed to understand
and use information from texts that include editorials, news stories,
poems, and fiction; for example, finding a piece of information in a
newspaper article, interpreting instructions from a warranty,
inferring a theme from a poem, or contrasting views expressed in an
editorial
Document literacy — the knowledge and skills required to locate
and use information contained in materials that include job
applications, payroll forms, transportation schedules, maps, tables,
and graphs; for example, locating a particular intersection on a street
map, using a schedule to choose the appropriate bus, or entering
information on an application form

Quantitative literacy — the knowledge and skills required to apply
arithmetic operations, either alone or sequentially, using numbers
embedded in printed materials; for example, balancing a checkbook,
figuring out a tip, completing an order form, or determining the
amount of interest from a loan advertisement
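A concrete instance of the kind of task the quantitative scale covers is determining the amount of interest from a loan advertisement. The sketch below makes the two-step arithmetic explicit; the loan figures are hypothetical, chosen only for illustration.

```python
# Hypothetical loan advertisement: borrow $1,000, repay $92.00 a month
# for 12 months. All figures are invented for illustration.
principal = 1000.00
monthly_payment = 92.00
months = 12

# Step 1: total amount repaid over the life of the loan.
total_repaid = monthly_payment * months
# Step 2: interest is what was repaid beyond the amount borrowed.
interest = total_repaid - principal

print(f"${interest:.2f}")  # $104.00
```

Tasks of this kind require the reader both to locate the relevant numbers in the printed advertisement and to set up the sequence of operations, which is what distinguishes them from single-step arithmetic.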
The literacy scales provide a useful way to organize a broad array of tasks
and to report the assessment results. They represent a substantial improvement
over traditional approaches to literacy assessment, which have tended to report
on performance in terms of single tasks or to combine the results from diverse
tasks into a single, conglomerate score. Such a score fosters the simplistic notion
that “literates” and “illiterates” can be neatly distinguished from one another
based on a single cutpoint on a single scale. The literacy scales, on the other
hand, make it possible to explore the various types and levels of literacy among
different subgroups in our society. In so doing, they help us to understand the
diverse information-processing skills associated with the broad range of printed
and written materials that adults read and their many purposes for reading them.
In adopting the three scales for use in the National Adult Literacy Survey,
the committee's aim was not to establish a single national standard for literacy.
Rather, it was to provide an interpretive scheme that would enable levels of
prose, document, and quantitative performance to be identified and allow
descriptions of the knowledge and skills associated with each level to be developed.
The prose, document, and quantitative scales were built initially to report
on the results of the young adult survey and were augmented in the survey of
job seekers. The Literacy Definition Committee recommended that a new set
of literacy tasks be developed to enhance the scales. These tasks would take
into account the following, without losing the ability to compare the results
with those of the earlier surveys:
• continued use of open-ended simulation tasks
• continued emphasis on tasks that measure a broad range of information-processing skills and cover a wide variety of contexts
• increased emphasis on simulation tasks that require brief written and/or oral responses
• increased emphasis on tasks that ask respondents to describe how they would set up and solve a problem
• the use of a simple, four-function calculator to solve selected quantitative problems


Approximately 110 new assessment tasks were field tested, and 80 of these
were selected for inclusion in the survey, in addition to 85 tasks that were
administered in both the young adult and job-seeker assessments. By administering
a common set of simulation tasks in each of the three literacy surveys, it is
possible to compare results across time and across population groups.
A large number of literacy tasks had to be administered to ensure that the
survey would provide good estimates of the literacy proficiencies of the adult
population. Yet, no individual could be expected to respond to the entire set of
165 simulation tasks. Accordingly, the survey was designed to give each person
participating in the study a subset of the total pool of literacy tasks, while at the
same time ensuring that each of the 165 tasks was administered to a nationally
representative sample of adults. Literacy tasks were assigned to sections that
could be completed in about 15 minutes, and these sections were then compiled
into booklets, each of which could be completed in about 45 minutes. During a
personal interview, each participant was asked to complete one booklet.
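The booklet design described above is a form of matrix sampling. A minimal sketch in Python, assuming (for illustration only) fifteen 11-task sections and a simple round-robin spiral — the survey's actual sectioning and pairing scheme is not specified here:

```python
from itertools import cycle

# The pool of 165 simulation tasks administered across the survey.
TASKS = [f"task_{i:03d}" for i in range(1, 166)]
SECTION_SIZE = 11  # assumption: 165 tasks split into 15 sections of ~15 minutes
sections = [TASKS[i:i + SECTION_SIZE] for i in range(0, len(TASKS), SECTION_SIZE)]

# Compile sections into booklets of three (about 45 minutes each) by
# spiraling through the section list, so each section recurs in booklets.
section_ids = cycle(range(len(sections)))
booklets = [[next(section_ids) for _ in range(3)] for _ in range(15)]

# Each respondent completes only one booklet, yet every one of the 165
# tasks is still administered to some subsample of respondents.
covered = {task for booklet in booklets for s in booklet for task in sections[s]}
```

The point of the spiral is exactly the trade-off the text describes: no individual sees the full task pool, but every task reaches a representative subsample.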
In addition to the time allocated for the literacy tasks, approximately
20 minutes were devoted to obtaining background and personal information
from respondents. Two versions of the background questionnaire were
administered, one in English and one in Spanish. Major areas explored
included: background and demographics — country of birth, languages spoken
or read, access to reading materials, size of household, educational attainment
of parents, age, race/ethnicity, and marital status; education — highest grade
completed in school, current aspirations, participation in adult education
classes, and education received outside the country; labor market experiences
— employment status, recent labor market experiences, and occupation; income
— personal as well as household; and activities — voting behavior, hours spent
watching television, frequency and content of newspaper reading, and use of
literacy skills for work and leisure. These background data make it possible to
gain an understanding of the ways in which personal characteristics are
associated with demonstrated performance on each of the three literacy scales.2

Conducting the Survey
The National Adult Literacy Survey was conducted during the first eight months of
1992 with a nationally representative sample of some 13,600 adults residing in
households. More than 400 trained interviewers, some of whom were bilingual
in English and Spanish, visited nearly 27,000 households to select and interview
adults ages 16 and older, each of whom was asked to provide personal and
2 A more detailed description of the design and framework can be found in an interim report, Assessing
Literacy: The Framework for the National Adult Literacy Survey. Washington, DC: National Center for
Education Statistics, October 1992.


background information and to complete a booklet of literacy tasks. Black and
Hispanic households were oversampled to ensure reliable estimates of literacy
proficiencies and to permit analyses of the performance of different
subpopulations. Those in the household population who agreed to participate
in the survey and completed as much of the assessment as their skills allowed
were paid $20 for their time in order to maximize response rates.
In addition, more than 1,100 inmates from some 80 federal and state
prisons were included in the survey. To ensure comparability with the national
survey, the simulation tasks given to the prison participants were the same as
those given to the household survey population. To address issues of particular
relevance to the prison population, a special version of the background
questionnaire was developed. This instrument drew questions from the 1991
Survey of State Prison Inmates sponsored by the Bureau of Justice Statistics of
the U.S. Department of Justice. These included queries about current offenses,
criminal history, and prison work assignments, as well as about education and
labor force experiences. A certificate of participation was given to all inmates
who completed the survey. It was not possible, as originally planned, to pay
inmates a $20 incentive in order to ensure that response rates for the prison
sample would be as high as those for the household sample. Many prisons have
regulations against prisoners receiving money. Furthermore, response rates for
prison surveys are typically high. Thus, a certificate of participation was
substituted for the monetary incentive.
Finally, to give the states an opportunity to explore the skill levels of their
populations, each of the 50 states was invited to participate in a concurrent
assessment. While many states expressed an interest, 11 elected to participate
in the State Adult Literacy Survey. Approximately 1,000 adults aged 16 to 64
were surveyed in each of the following states:
California, Illinois, Indiana, Iowa, Louisiana, New Jersey,
New York, Ohio, Pennsylvania, Texas, and Washington
To permit comparisons of the state and national results, the survey instruments
administered to the state and national samples were identical and the data were
gathered at the same time. A twelfth state, Florida, also participated, but its
survey was unavoidably delayed until 1993.
Responses from the national household, the prison, and state samples
were combined to yield the best possible performance estimates. In all, more
than 26,000 adults gave, on average, more than an hour of their time to complete
the instruments. Unfortunately, because of the delayed administration, the


results from the Florida survey could not be included in the national estimates.
The assessed sample size and corresponding national population size are
presented in table 1.1 by the demographic characteristics of the adults who
participated in the survey.
Further information on the design of the sample, the survey administration,
the statistical analyses and special studies that were conducted, and the validity
of the literacy scales will be available in a forthcoming technical report, to be
published in 1994.

Reporting the Results
The results of the National Adult Literacy Survey are reported using three
scales, each ranging from 0 to 500: a prose scale, a document scale, and a
quantitative scale. The scores on each scale represent degrees of proficiency
along that particular dimension of literacy. For example, a low score (below
225) on the document scale indicates that an individual has very limited skills in
processing information from tables, charts, graphs, maps, and the like (even
those that are brief and uncomplicated). On the other hand, a high score
(above 375) indicates advanced skills in performing a variety of tasks that
involve the use of complex documents.
Survey participants received proficiency scores according to their performance
on the survey tasks. A relatively small proportion of the respondents answered
only a part of the survey, and a statistical procedure was used to make the best
possible estimates of their proficiencies. This procedure and related issues are
detailed in the technical report.
Most respondents tended to obtain similar, though not identical, scores on
the three literacy scales. This does not mean, however, that the underlying
skills involved in prose, document, and quantitative literacy are the same. Each
scale provides some unique information, especially when comparisons are made
across groups defined by variables such as race/ethnicity, education, and age.
The literacy scales allow us not only to summarize results for various
subpopulations, but also to determine the relative difficulty of the literacy tasks
included in the survey. In other words, just as individuals receive scale scores
according to their performance in the assessment, the literacy tasks receive
specific scale values according to their difficulty, as determined by the
performance of the adults who participated in the survey. Previous research has
shown that the difficulty of a literacy task, and therefore its placement on the
literacy scale, is determined by three factors: the structure of the material —
for example, exposition, narrative, table, graph, map, or advertisement; the


Table 1.1
The National Adult Literacy Survey Sample

Total Population*
                                           Assessed       National
                                           sample         population
Total                                      26,091         191,289,250
Sex
  Male                                     11,770          92,098,158
  Female                                   14,279          98,900,965
Age
  16 to 18 years                            1,237          10,423,866
  19 to 24 years                            3,344          24,514,789
  25 to 39 years                           10,050          63,277,808
  40 to 54 years                            6,310          43,794,468
  55 to 64 years                            2,924          19,503,078
  65 years and older                        2,214          29,735,489
Race/Ethnicity
  White                                    17,292         144,967,759
  Black                                     4,963          21,192,151
  Hispanic/Mexican                          1,776          10,234,806
  Hispanic/Puerto Rican                       405           2,190,094
  Hispanic/Cuban                              147             928,116
  Hispanic/Central or South American          424           2,607,829
  Hispanic/Other                              374           2,520,468
  Asian or Pacific Islander                   438           4,116,356
  American Indian or Alaskan Native           189           1,802,724
  Other                                        83             728,948

Prison Population
                                           Assessed       National
                                           sample         population
Total                                       1,147             765,651
Sex
  Male                                      1,076             722,632
  Female                                       71              43,019
Race/Ethnicity
  White                                       417             265,602
  Black                                       480             340,308
  Hispanic                                    211             134,048
  Asian or Pacific Islander                     7               4,106
  American Indian or Alaskan Native            27              17,758
  Other                                         5               3,829

*The total population includes adults living in households and those in prison. The sample sizes for
subpopulations may not add up to the total sample sizes because of missing data.
Source: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey,
1992.


content of the materials and/or the context from which it is drawn — for example,
home, work, or community; and the nature of the task — that is, what the
individual is asked to do with the material, or his or her purpose for using it.3
The literacy tasks administered in the NALS varied widely in terms of
materials, content, and task requirements, and thus in terms of difficulty. This
range is captured in figure 1, which describes some of the literacy tasks and
indicates their scale values.
Even a cursory review of this display reveals that tasks at the lower end
of each scale differ from those at the high end. A more careful analysis of the
range of tasks along each scale provides clear evidence of an ordered set of
information-processing skills and strategies. On the prose scale, for example,
tasks with low scale values tend to ask readers to locate or identify information
in brief, familiar, or uncomplicated materials, while those at the high end ask
them to perform more demanding activities using materials that tend to be
lengthy, unfamiliar, or complex. Similarly, on the document and quantitative
scales, the tasks at the low end of the scale differ from those at the high end in
terms of the structure of the materials, the content and context of the material,
and the nature of the directive.
In an attempt to capture this progression of information-processing
skills and strategies, each scale was divided into five levels: Level 1 (0 to 225),
Level 2 (226 to 275), Level 3 (276 to 325), Level 4 (326 to 375), and Level 5
(376 to 500). The points and score ranges that separate these levels on each
scale reflect shifts in the literacy skills and strategies required to perform
increasingly complex tasks. The survey tasks were assigned to the appropriate
point on the appropriate scale based on their difficulty as reflected in the
performance of a nationally representative sample of adults surveyed. Analyses
of the types of materials and demands that characterize each level reveal the
progression of literacy demands along each scale (figure 2). (See appendix A for
a detailed discussion of the levels for each scale.)
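Expressed as code, the level bands amount to a simple cutpoint lookup. A minimal sketch using the score ranges given above:

```python
def literacy_level(score: float) -> int:
    """Map a 0-500 NALS proficiency score to its literacy level (1-5)."""
    if not 0 <= score <= 500:
        raise ValueError("score must lie on the 0-500 scale")
    # Upper bound of each level, per the ranges defined in the text.
    for upper_bound, level in [(225, 1), (275, 2), (325, 3), (375, 4), (500, 5)]:
        if score <= upper_bound:
            return level
```

For example, a score of 225 falls in Level 1 and a score of 226 in Level 2, mirroring the boundaries listed in the text.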
While the literacy levels on each scale can be used to explore the range of
literacy demands, these data do not reveal the types of literacy demands that
are associated with particular contexts in this pluralistic society. That is, they do
not enable us to say what specific level of prose, document, or quantitative skill
is required to obtain, hold, or advance in a particular occupation, to manage a
household, or to obtain legal or community services, for example. Nevertheless,
the relationships among performance on the three scales and various social or
economic indicators can provide valuable insights.
3 I.S. Kirsch and P.B. Mosenthal. (1990). "Exploring Document Literacy: Variables Underlying the
Performance of Young Adults," Reading Research Quarterly, 25, pp. 5-30; P.B. Mosenthal and I.S. Kirsch.
(1992). "Defining the Constructs of Adult Literacy," paper presented at the National Reading Conference,
San Antonio, Texas.


Figure 1
NALS: Difficulty Values of Selected Tasks Along the Prose, Document, and Quantitative Literacy Scales

[The figure, not reproduced here, arrays selected assessment tasks along each 0-to-500 scale at their
difficulty values, ranging from simple tasks, such as signing one's name or locating the expiration date
on a driver's license, to demanding ones, such as contrasting views expressed in two editorials,
planning travel arrangements using a flight schedule, or calculating miles per gallon from a mileage
record chart. The three-column layout of the original figure could not be recovered reliably from this
text version.]

Source: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey, 1992.

Figure 2
NALS: Description of the Prose, Document, and Quantitative Literacy Levels

Level 1 (0-225)
Prose: Most of the tasks in this level require the reader to read relatively short text to locate a single piece of information which is identical to or synonymous with the information given in the question or directive. If plausible but incorrect information is present in the text, it tends not to be located near the correct information.
Document: Tasks in this level tend to require the reader either to locate a piece of information based on a literal match or to enter information from personal knowledge onto a document. Little, if any, distracting information is present.
Quantitative: Tasks in this level require readers to perform single, relatively simple arithmetic operations, such as addition. The numbers to be used are provided and the arithmetic operation to be performed is specified.

Level 2 (226-275)
Prose: Some tasks in this level require readers to locate a single piece of information in the text; however, several distractors or plausible but incorrect pieces of information may be present, or low-level inferences may be required. Other tasks require the reader to integrate two or more pieces of information or to compare and contrast easily identifiable information based on a criterion provided in the question or directive.
Document: Tasks in this level are more varied than those in Level 1. Some require the readers to match a single piece of information; however, several distractors may be present, or the match may require low-level inferences. Tasks in this level may also ask the reader to cycle through information in a document or to integrate information from various parts of a document.
Quantitative: Tasks in this level typically require readers to perform a single operation using numbers that are either stated in the task or easily located in the material. The operation to be performed may be stated in the question or easily determined from the format of the material (for example, an order form).

Level 3 (276-325)
Prose: Tasks in this level tend to require readers to make literal or synonymous matches between the text and information given in the task, or to make matches that require low-level inferences. Other tasks ask readers to integrate information from dense or lengthy text that contains no organizational aids such as headings. Readers may also be asked to generate a response based on information that can be easily identified in the text. Distracting information is present, but is not located near the correct information.
Document: Some tasks in this level require the reader to integrate multiple pieces of information from one or more documents. Others ask readers to cycle through rather complex tables or graphs which contain information that is irrelevant or inappropriate to the task.
Quantitative: In tasks in this level, two or more numbers are typically needed to solve the problem, and these must be found in the material. The operation(s) needed can be determined from the arithmetic relation terms used in the question or directive.

Level 4 (326-375)
Prose: These tasks require readers to perform multiple-feature matches and to integrate or synthesize information from complex or lengthy passages. More complex inferences are needed to perform successfully. Conditional information is frequently present in tasks at this level and must be taken into consideration by the reader.
Document: Tasks in this level, like those at the previous levels, ask readers to perform multiple-feature matches, cycle through documents, and integrate information; however, they require a greater degree of inferencing. Many of these tasks require readers to provide numerous responses but do not designate how many responses are needed. Conditional information is also present in the document tasks at this level and must be taken into account by the reader.
Quantitative: These tasks tend to require readers to perform two or more sequential operations or a single operation in which the quantities are found in different types of displays, or the operations must be inferred from semantic information given or drawn from prior knowledge.

Level 5 (376-500)
Prose: Some tasks in this level require the reader to search for information in dense text which contains a number of plausible distractors. Others ask readers to make high-level inferences or use specialized background knowledge. Some tasks ask readers to contrast complex information.
Document: Tasks in this level require the reader to search through complex displays that contain multiple distractors, to make high-level text-based inferences, and to use specialized knowledge.
Quantitative: These tasks require readers to perform multiple operations sequentially. They must disembed the features of the problem from text or rely on background knowledge to determine the quantities or operations needed.

Source: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey, 1992.


About This Report
This report looks at the literacy skills of the prison population from several
vantage points. Chapter 2 profiles the literacy skills of the prison population
as a whole and of various subgroups defined by gender, race/ethnicity, age,
level of education, and disabilities and compares them with the household
population. Chapter 3 explores the relationship between inmates’ background
and experiences before incarceration and their literacy proficiencies. Chapter 4
discusses their prison experiences and the literacy proficiencies associated with
those who participate in various prison programs. In chapter 5, the relationship
between literacy proficiency and recidivism is examined. Chapter 6 compares
the literacy practices of the prison and household populations, as well as
their self-perceptions of literacy ability.
In interpreting the results of this study, readers should bear in mind
that the literacy tasks contained in this assessment and the adults invited to
participate in the survey are samples drawn from their two respective
universes. As such, the results are subject to some measurable degree of
uncertainty. Scientific procedures employed in the study design and the scaling
of literacy tasks permit a high degree of confidence in the resulting estimates of
task difficulty. Similarly, the sampling design and weighting procedures applied
in this survey assure that participants’ responses can be generalized to the
populations of interest. Discussions of differences between various
subpopulations are based on statistical tests that consider the magnitude of the
differences (for example, the difference in average document proficiency
between high school and college graduates), the size of the standard errors
associated with the numbers being compared, and the number of comparisons
being made. Only statistically significant differences (at the .05 level) are
discussed in this report. Because of the small sample size of the prison
population in particular, readers who wish to make their own comparisons
are advised not to use the numbers alone to compare various groups, but
rather to rely on statistical tests.
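The kind of comparison described above can be illustrated with a two-sample z-test on estimated means and standard errors. A minimal sketch with hypothetical numbers — the actual NALS procedures, including how multiple comparisons are handled, are documented in the survey's technical report:

```python
import math

def means_differ(mean_a, se_a, mean_b, se_b, n_comparisons=1, alpha=0.05):
    """Return True if two subgroup means differ significantly, using a
    normal approximation and a Bonferroni adjustment for the number of
    comparisons being made."""
    z = abs(mean_a - mean_b) / math.sqrt(se_a ** 2 + se_b ** 2)
    p = math.erfc(z / math.sqrt(2))  # two-sided p-value under the normal
    return p < alpha / n_comparisons

# Hypothetical average document proficiencies with standard errors:
print(means_differ(277.0, 1.2, 256.0, 2.4, n_comparisons=3))  # large gap, small SEs
print(means_differ(277.0, 5.0, 276.0, 5.0))                   # gap within noise
```

The second call shows why the numbers alone can mislead: a one-point gap between group averages is not evidence of a real difference when the standard errors are large.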
The goal of this report is to provide useful information to all those who
wish to understand the current status of literacy among the nation’s prison
population and to strengthen existing literacy policies and programs. In
considering the results, the reader should keep in mind that this was a survey of
literacy in the English language — not literacy in any universal sense of the
word. Thus, the results do not capture the literacy resources and abilities that
respondents may possess in languages other than English.


A Note on Interpretations
In reviewing the information contained in this report, readers should be aware
that no single factor determines what an individual’s literacy proficiencies will
be. All of us develop our own unique repertoire of competencies depending
on a wide array of conditions and circumstances, including our family
backgrounds, educational attainments, interests and aspirations, economic
resources, and employment experiences. Any single survey, this one included,
can focus on only some of these variables.
Further, while the results reveal certain characteristics that are related to
literacy, the nature of the survey makes it impossible to determine the direction
of these relationships. In other words, it is impossible to identify the extent to
which literacy shapes particular aspects of our lives or is, in turn, shaped by
them. For example, there is a strong relationship between educational
attainment and literacy proficiencies. On the one hand, it is likely that staying
in school longer does strengthen an individual’s literacy skills. On the other
hand, it is also true that those with more advanced skills tend to remain in
school longer. Other variables, as well, are likely to play a role in the
relationship between literacy and education. In interpreting such relationships
in this report, the authors strive to acknowledge the many factors involved.
A final note deserves emphasis. This report describes the literacy
proficiencies of various subpopulations defined by characteristics such as age,
sex, race, ethnicity, and educational background. While certain groups
demonstrate lower literacy skills than others, on average, within every group
there are some individuals who perform well and some who perform poorly.
Accordingly, when one group is said to have lower average proficiencies than
another, this does not imply that all adults in the first group perform worse
than those in the second. Such statements are only intended to highlight
general patterns of differences among various groups and, therefore, do not
capture the variability within each group.


CHAPTER 2
The Prose, Document, and Quantitative
Literacy Skills of America’s Prisoners

At the same time that the National Adult Literacy Survey was assessing the
literacy skills of America’s households, a representative sample of state and
federal prison inmates was also assessed. There were two reasons for assessing
prisoners’ literacy skills: to increase the accuracy of national subpopulation
estimates and to provide information on the literacy skills of federal and state
prisoners. Prison inmates have been included in national figures published in
other reports from the survey to give a more precise estimate of the literacy
proficiencies of subgroups of the adult population. In this report, the literacy
levels of the prison and household groups are reported separately to provide a
profile of the literacy skills of prisoners and to compare them with the
household population. Prisons and not jails were selected for the survey
because the Federal government has been mandated by Congress “to assist
State and local educational programs for criminal offenders in correctional
institutions.”1 Prisons house about 65 percent of the incarcerated population
and inmates are generally held for longer periods of time than those confined
in jails and community-based facilities.
This chapter of the report describes multiple dimensions of prisoner
literacy and compares the prose, document, and quantitative literacy skills of
prisoners to those of the adult household population. This chapter also
examines and compares literacy proficiencies of the prisoner and household
populations by the subgroups of educational attainment, race/ethnicity, sex, and
age, as well as by disabilities.
In this report, the results are examined in two ways. First, general
comparisons of literacy proficiency are made by examining the average
performance of various subpopulations on each of the literacy scales. This is
the preferred method for this report because of the size of the sample of prison
inmates interviewed. Second, percentage distributions of each population in

1 Public Law 101-392, section 214(a)(2), September 28, 1990.


the five levels of each scale are also presented to provide a range of literacy
proficiency for both the prison and household populations. As described in
chapter 1, five literacy levels were defined along the prose, document, and
quantitative scales: Level 1 (ranging from 0 to 225), Level 2 (226 to 275),
Level 3 (276 to 325), Level 4 (326 to 375), and Level 5 (376 to 500).

The Correctional Population at the End of 1991
At the end of 1991, an estimated 4,641,000 adults in this country were under
some form of correctional supervision (table 2.1; the data are from the Bureau
of Justice Statistics, 1991, and therefore differ from this survey's population
estimates, because the Bureau of Justice Statistics' survey included
community-based as well as confinement facilities, whereas this survey included
only confinement facilities). Approximately 17 percent of these individuals
were in prison, 13 percent on parole, 9 percent in jail, and 61 percent on
probation. About 1,350 persons out of every 100,000 in the general population
were serving probation or parole sentences while living among the general
population. Consequently, a little over 1 percent of the household population
as defined in this report were under criminal justice supervision in the
community at the time of the survey. For every 100,000 residents in the
general population, the jails and prisons of the country held about
480 prisoners — or about 0.5 percent of the size of the general population.
Persons held in prison constitute an even smaller number compared with
the general population — about 0.3 percent.

Table 2.1
Adult Correctional Population, Year-End 1991

Criminal Justice                      Rate per 100,000
Status           Number               residents
Total            4,641,000**          1,829.6
Probation        2,819,000**          1,111.3
Jail               424,129**            167.2
Prison             792,176**            312.3
Parole             606,000**            238.9

**Estimated
**Includes both confinement and community-based facilities.
Source: Bureau of Justice Statistics, Correctional Populations in the United States, 1990 and
Correctional Populations in the United States, 1991.


Characteristics of the Prison and Household Populations
Prison inmates differ from the household population with respect to many
demographic characteristics (table 2.2). Generally, prison inmates are more
likely to be male, minority, young, and less educated than the household
population. The vast majority of prison inmates are male (94 percent), while
less than half of the household population is male (48 percent). Almost two-thirds of prison inmates belong to some racial or ethnic minority, compared
with about one-quarter of the household population. Almost 65 percent of
prison inmates are below the age of 35, compared with 40 percent of the
household population. About 20 percent of the prison population, compared
with 45 percent of the household population, has had some education beyond
high school; 49 percent of the prison population, compared with 24 percent of
the household population, did not complete either high school or a GED.

Results for the Prison Population
Thirty-one percent of prison inmates perform in Level 1 on the prose literacy
scale, 33 percent are in this level on the document scale, and 40 percent on
the quantitative scale (table 2.3). This means that approximately 237,000 to
306,000 of 766,000 prison inmates perform in the lowest level on each of the
literacy scales. Prison inmates at this level may be able to read short pieces of
text to find a single fact, enter personal information on a document, or add
numbers that are set up in a column format. Other inmates in Level 1,
however, do not demonstrate the ability to perform even these fairly
straightforward literacy tasks.
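The head counts quoted above are the Level 1 percentages applied to the prison population estimate of 766,000 (the WGT N in table 2.3). A quick check (names are ours):

```python
# Level 1 percentages from table 2.3 and the estimated prison
# population in persons (WGT N = 766 thousand).
prison_pop = 766_000
level1_pct = {"prose": 31, "document": 33, "quantitative": 40}

# Estimated number of inmates performing in Level 1 on each scale.
level1_count = {scale: pct * prison_pop // 100
                for scale, pct in level1_pct.items()}
```

This gives roughly 237,000 inmates on the prose scale and 306,000 on the quantitative scale, the range cited in the text.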
Performing in Level 2 are about 37 percent of prison inmates on the prose
scale, 38 percent on the document scale, and 32 percent on the quantitative
scale — about 245,000 to 291,000 prisoners. Prisoners at this level on the prose
scale can generally make low-level inferences based on what they read and
integrate two or more pieces of information. Those in Level 2 on the document
scale can locate a piece of information in a document in which plausible but
inexact information is present and can integrate information from various parts
of a document. Prisoners in Level 2 on the quantitative scale can correctly add,
subtract, multiply, or divide simple numbers found in a text.
Between 22 percent and 26 percent of prisoners — about 169,000 to
199,000 prisoners in all — could perform literacy tasks in Level 3. Prisoners in
this level on the prose scale could integrate information from relatively long or
dense text, and those in this level on the document scale could integrate


TABLE 2.2
Percentages of Adults in Prison and Household Populations, by
Various Demographic Characteristics

                                               POPULATIONS
                                          Prison        Household
CHARACTERISTICS                         CPCT ( SE )    CPCT ( SE )

Gender
  Male                                   94 ( 0.0)      48 ( 0.0)
  Female                                  6 ( 0.0)      52 ( 0.0)

Race/Ethnicity
  White                                  35 ( 0.6)      76 ( 0.5)
  Black                                  44 ( 0.0)      11 ( 0.0)
  Asian or Pacific Islander               1 ( 0.2)       2 ( 0.2)
  American Indian or Alaskan Native       2 ( 0.4)       1 ( 0.3)
  Other                                   0†( 0.2)       0†( 0.1)
  Hispanic groups                        18 ( 0.4)      10 ( 0.2)

Age
  16 to 18                                2 ( 0.4)       5 ( 0.1)
  19 to 24                               21 ( 1.3)      13 ( 0.2)
  25 to 34                               41 ( 1.8)      22 ( 0.2)
  35 to 54                               33 ( 1.4)      34 ( 0.3)
  55 to 64                                2 ( 0.4)      10 ( 0.2)
  65 and older                            1 ( 0.3)      16 ( 0.2)

Level of Education
  0 to 8 years                           14 ( 0.1)      10 ( 0.3)
  9 to 12 years                          35 ( 1.1)      14 ( 0.2)
  High school diploma                    14 ( 1.1)      28 ( 0.2)
  GED                                    17 ( 1.0)       4 ( 0.2)
  Some postsecondary                     16 ( 0.8)      22 ( 0.2)
  Postsecondary degree                    4 ( 0.4)      23 ( 0.2)

CPCT = column percentage estimate; (SE) = standard error of the estimate (the true population value can be said to be within 2 standard
errors of the sample estimate with 95% certainty).
† Percentages less than 0.5 are rounded to 0.
Source: Educational Testing Service, National Adult Literacy Survey, 1992.


TABLE 2.3
Percentages at Each Level and Average Proficiencies on Each Literacy Scale
of Prison and Household Populations

                                     LEVELS AND AVERAGE PROFICIENCY
LITERACY                        Level 1       Level 2       Level 3       Level 4       Level 5        Average
SCALES BY             WGT N   225 or lower   226 to 275    276 to 325    326 to 375   376 or higher  proficiency
POPULATIONS    n     (/1,000)  RPCT ( SE )   RPCT ( SE )   RPCT ( SE )   RPCT ( SE )   RPCT ( SE )   PROF ( SE )

Prose
  Prison      1,147       766   31 ( 1.7)     37 ( 2.0)     26 ( 1.6)      6 ( 0.8)     0†( 0.2)     246 ( 1.9)
  Household  24,944   190,524   21 ( 0.4)     27 ( 0.6)     32 ( 0.7)     17 ( 0.4)     3 ( 0.2)     273 ( 0.6)

Document
  Prison      1,147       766   33 ( 2.1)     38 ( 2.1)     25 ( 1.5)      4 ( 0.9)     0†( 0.2)     240 ( 2.2)
  Household  24,944   190,524   23 ( 0.4)     28 ( 0.5)     31 ( 0.5)     15 ( 0.4)     3 ( 0.2)     267 ( 0.7)

Quantitative
  Prison      1,147       766   40 ( 1.9)     32 ( 2.2)     22 ( 1.9)      6 ( 1.0)     1 ( 0.4)     236 ( 3.1)
  Household  24,944   190,524   22 ( 0.5)     25 ( 0.6)     31 ( 0.6)     17 ( 0.3)     4 ( 0.2)     271 ( 0.7)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); RPCT = row percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the true
population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
† Percentages less than 0.5 are rounded to 0.

Source: Educational Testing Service, National Adult Literacy Survey, 1992.

multiple pieces of information found in documents. Prisoners in Level 3 on the
quantitative scale could perform arithmetic operations using two or more
numbers found in printed material.
About 6 percent of inmates, or about 46,000, could successfully perform
Level 4 tasks on the prose scale and, thus, could synthesize information from
lengthy or complex passages. Four percent, about 31,000, are in Level 4 on the
document scale and are able to make inferences based on text and documents.
Six percent perform in Level 4 of the quantitative scale; they can perform
sequential arithmetic operations using numbers found in different types of
displays.
Less than 0.5 percent of prison inmates perform in Level 5 on the prose
and document scales and about 1 percent on the quantitative scale. To perform
in Level 5 on the prose scale, one must contrast complex information found in
written materials, make high-level inferences, or search for information in
dense text. Level 5 on the document scale requires readers to use specialized
knowledge and search complex displays for particular pieces of information.
To achieve this level on the quantitative scale, respondents must determine the
features of arithmetic problems either by examining text or by using
background knowledge and then perform the multiple arithmetic operations
required. Very few inmates, 8,000 or fewer, perform in this level.

Comparison of the Prison and Household Populations
Prisoners are more likely than the household population to perform in the
lower levels of the three scales (table 2.3). About one in three prison inmates
performs in Level 1 on the prose scale, compared with one in five of the
household population. About 33 percent of prison inmates and 23 percent of
the household population perform in Level 1 on the document scale, and
40 percent of prisoners and 22 percent of household respondents on the
quantitative scale. Thus, the differences in percentages performing in Level 1
are 10 for the prose and document scales and 18 for the quantitative scale.
Compared with the household population, relatively few prisoners
perform in Levels 4 and 5. Six percent of prisoners, compared with 20 percent
of the household population, are in Levels 4 and 5 on the prose scale; 4 percent
of inmates, compared with 18 percent of the household population, on the
document scale; and 7 percent of prisoners, compared with 21 percent of
householders, on the quantitative scale.
Average proficiency scores on the three scales are also lower for prisoners
than for the household population. Although both prisoners and householders,
on average, perform in Level 2, prisoners’ proficiency scores are 27 points
lower on the prose and document scales than those of householders and
35 points lower on the quantitative scale, placing inmates below the middle
of the Level 2 range.
Thus, prisoners consistently demonstrate lower proficiency than the
household population on all three scales, whether measured by the distribution
of prisoners in the levels of each scale or by their average proficiency scores.


Comparisons of the Prison and Household Populations
by Education, Race/Ethnicity, Sex, and Age
As was pointed out previously, the prison population differs from the household
population in such characteristics as gender, race/ethnicity, educational
attainment, and age. The next question is whether these differences are related
to the lower scores of prisoners as compared with those of the household
population. For example, are proficiency scores of the prison population lower
because fewer of them attained higher levels of education as compared with
the household population? In this section, each of these background
characteristics, namely, level of education, race/ethnicity, gender, and age, will
be examined separately. Also, race/ethnicity and educational attainment will
be combined and the proficiencies compared within each group to see if
overrepresentation of minorities and of those with low levels of education can
explain the differences in the proficiencies of prisoners and householders.
Finally, results of a regression analysis will be discussed.
Results by Educational Attainment
As shown in table 2.2, the prison population attained lower levels of education
than the household population. Greater percentages of prisoners than
householders attained less than a high school diploma; 14 percent of prisoners
have 0 to 8 years of education, compared with 10 percent of householders, and
35 percent of prisoners have 9 to 12 years of education compared with 14 percent
of householders. While a greater percentage of householders than prisoners
have a high school diploma (28 and 14 percent, respectively), 17 percent of
prisoners have a GED, compared with 4 percent of householders. It may be
that many prisoners have gained their high school equivalency certificate while
in prison. Nonetheless, 31 percent of prisoners and 32 percent of householders
have a high school diploma or GED. On the other hand, a lower percentage of
prisoners than householders have some postsecondary education (16 compared
with 22 percent) or a postsecondary degree (4 compared with 23 percent).
Not surprisingly, since the business of educational institutions is to
transmit literacy skills, literacy scores are highly related to educational
attainment (table 2.4). First of all, the average proficiencies of inmates are
related to educational attainment. The proficiencies of inmates with some
postsecondary education range from 275 to 285 on the three scales and are
higher than the proficiencies of inmates with a high school diploma. These
inmates, as well as those who have a GED, demonstrate higher proficiencies
than inmates with 9 to 12 years of education. Inmates with 0 to 8 years of
education demonstrate lower proficiencies, ranging from 176 to 196 on the
three scales, than those with 9 to 12 years. While the proficiencies of inmates
with a GED appear to be higher than those of inmates with a high school
diploma, the differences do not reach statistical significance. Inmates with a
GED, however, demonstrate proficiencies that are the same as those of
inmates with postsecondary education on the prose and quantitative scales.
Second, as the educational attainment of inmates increases, the
percentage of inmates performing in Level 1 generally decreases. For example,
65 to 70 percent of prisoners who reported 0 to 8 years of education perform in
Level 1 on the three scales, compared with 41 to 51 percent of prisoners with
9 to 12 years of education. On all three scales, the percentages of inmates with
a high school diploma in Level 1 are about the same as those of inmates with
9 to 12 years of education; however, the percentages of inmates with a GED
performing in Level 1 (10 to 21 percent) are less than those of inmates with
9 to 12 years. Furthermore, about the same percentages of inmates with a
GED as of inmates with some postsecondary education (10 to 15 percent)
perform in Level 1, while the percentages in Level 1 for high school graduates
are greater than the percentages for inmates with some postsecondary education.
When inmate and household populations are compared, inmates with a
GED demonstrate proficiencies comparable to those of householders with
a GED — about 270 on the prose scale and 265 on the document and
quantitative scales. In addition, about the same percentages of both prison and
household GED holders perform in Level 1 on all three scales. In contrast, the
average proficiencies of inmates with a high school diploma are lower on all
three scales than those of householders with the same education level.
Furthermore, significantly more inmate graduates than household graduates
perform in Level 1 on the quantitative scale and significantly fewer inmates
perform in Level 4 on the prose scale. Thus, inmates with a GED appear to
have an advantage over inmates with a high school diploma. Inmate GED
holders demonstrate about the same proficiencies as inmates with some
postsecondary education, while inmate high school graduates demonstrate
lower proficiencies. In addition, inmate GED holders demonstrate the same
proficiencies as household GED holders. In contrast, inmate graduates
demonstrate lower proficiencies than their household counterparts.
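The significance judgments above rest on the rule stated in the table notes: the true value lies within 2 standard errors of an estimate with 95 percent certainty, so two independent estimates differ reliably when their gap exceeds roughly twice the standard error of the difference. A simplified sketch follows (the function name is ours, and the survey's actual tests also account for its complex sample design):

```python
import math

def differs_significantly(est_a, se_a, est_b, se_b):
    """Crude two-estimate comparison: True at roughly 95% confidence
    when the gap exceeds 2 standard errors of the difference."""
    se_diff = math.sqrt(se_a ** 2 + se_b ** 2)
    return abs(est_a - est_b) > 2 * se_diff

# Prose averages from table 2.4:
# inmate high school graduates 255 (5.0) vs. householders 270 (1.1)
#   -> flagged, consistent with the text's report of lower proficiency;
# inmate GED holders 270 (4.3) vs. householders 268 (1.8)
#   -> not flagged, consistent with "about the same".
```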


Table 2.4
Percentages at Each Level and Average Proficiencies on Each Literacy
Scale of Prison and Household Populations, by Educational Attainment

                              LEVELS AND AVERAGE PROFICIENCIES
LITERACY SCALES                Level 1    Level 2    Level 3    Level 4    Level 5     Average
BY EDUCATION          WGT N    225 or     226-275    276-325    326-375    376 or    proficiency
BY POPULATIONS   n   (/1,000)   lower                                      higher
                              RPCT (SE)  RPCT (SE)  RPCT (SE)  RPCT (SE)  RPCT (SE)  PROF (SE)

Prose
 0 to 8 years
  Prison        157       107  66 (4.2)   24 (3.8)   10 (4.0)    1 (0.6)   0*(0.0)   196 (5.0)
  Household   2,167    18,356  75 (1.7)   20 (1.4)    4 (0.9)    0*(0.3)   0*(0.0)   177 (2.6)
 9 to 12 years
  Prison        385       271  41 (3.1)   44 (3.5)   14 (2.4)    1 (0.6)   0*(0.0)   230 (3.0)
  Household   3,311    24,982  42 (1.4)   38 (1.1)   17 (1.0)    2 (0.4)   0*(0.1)   231 (1.5)
 GED
  Prison        183       130  10 (3.1)   44 (4.9)   39 (5.6)    6 (3.0)   0*(0.3)   270 (4.3)
  Household   1,062     7,224  14 (1.6)   39 (2.5)   39 (2.8)    7 (1.2)   0*(0.6)   268 (1.8)
 High school
  Prison        154       107  25 (5.3)   39 (5.0)   32 (6.0)    5 (2.0)   0*(0.0)   255 (5.0)
  Household   6,107    51,290  16 (0.8)   36 (1.3)   37 (1.7)   10 (0.9)   1 (0.2)   270 (1.1)
 Postsecondary
  Prison        264       149  10 (2.0)   27 (3.6)   42 (3.7)   19 (3.4)   2 (1.1)   285 (3.7)
  Household  12,143    80,426   5 (0.3)   17 (0.6)   39 (0.8)   32 (0.8)   7 (0.4)   310 (0.8)

Document
 0 to 8 years
  Prison        157       107  69 (3.6)   23 (4.1)    7 (2.6)    1 (0.5)   0*(0.0)   176 (6.1)
  Household   2,167    18,356  79 (1.7)   18 (1.6)    3 (0.8)    0*(0.1)   0*(0.0)   170 (2.4)
 9 to 12 years
  Prison        385       271  41 (3.0)   43 (3.9)   14 (2.7)    2 (1.0)   0*(0.0)   230 (2.8)
  Household   3,311    24,982  46 (1.7)   37 (1.6)   15 (1.3)    2 (0.4)   0*(0.1)   227 (1.6)
 GED
  Prison        183       130  16 (3.3)   47 (6.2)   32 (5.0)    4 (2.7)   0*(0.3)   263 (4.3)
  Household   1,062     7,224  17 (2.0)   42 (2.7)   34 (2.3)    7 (1.1)   0*(0.5)   264 (2.2)
 High school
  Prison        154       107  27 (4.9)   37 (5.7)   32 (4.7)    4 (2.4)   0*(0.0)   251 (5.6)
  Household   6,107    51,290  20 (0.8)   38 (1.0)   33 (1.1)    9 (0.6)   1 (0.2)   264 (1.1)
 Postsecondary
  Prison        264       149  11 (2.1)   31 (3.1)   44 (4.1)   13 (2.9)   1 (0.8)   279 (3.0)
  Household  12,143    80,426   6 (0.3)   20 (0.6)   39 (0.6)   29 (0.8)   6 (0.4)   303 (0.7)

n = sample size; WGT N = population size estimate /1,000 (the sample sizes for subpopulations may not add up to the total
sample sizes because of missing data); RPCT = row percentage estimate; PROF = average proficiency estimate; (SE) =
standard error of the estimate (the true population value can be said to be within 2 standard errors of the sample estimate with
95% certainty).
* Percentages less than 0.5 are rounded to zero.
Source: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey, 1992.


Table 2.4 (continued)
Percentages at Each Level and Average Proficiencies on Each Literacy
Scale of Prison and Household Populations, by Educational Attainment

                              LEVELS AND AVERAGE PROFICIENCIES
LITERACY SCALES                Level 1    Level 2    Level 3    Level 4    Level 5     Average
BY EDUCATION          WGT N    225 or     226-275    276-325    326-375    376 or    proficiency
BY POPULATIONS   n   (/1,000)   lower                                      higher
                              RPCT (SE)  RPCT (SE)  RPCT (SE)  RPCT (SE)  RPCT (SE)  PROF (SE)

Quantitative
 0 to 8 years
  Prison        157       107  70 (5.1)   21 (3.5)    7 (2.6)    2 (1.4)   0*(0.4)   182 (8.4)
  Household   2,167    18,356  76 (2.0)   18 (1.8)    5 (1.1)    1 (0.3)   0*(0.2)   169 (3.1)
 9 to 12 years
  Prison        385       271  51 (2.8)   34 (3.4)   13 (2.1)    2 (0.9)   0*(0.3)   219 (3.5)
  Household   3,311    24,982  45 (1.6)   34 (1.6)   17 (1.3)    3 (0.6)   0*(0.1)   227 (1.7)
 GED
  Prison        183       130  21 (5.2)   40 (5.6)   32 (5.7)    7 (2.5)   0*(1.4)   263 (4.6)
  Household   1,062     7,224  16 (2.0)   38 (2.5)   35 (2.5)   10 (1.4)   1 (0.5)   268 (2.7)
 High school
  Prison        154       107  36 (5.0)   32 (5.8)   26 (4.3)    6 (3.0)   0*(0.3)   244 (6.7)
  Household   6,107    51,290  18 (0.8)   33 (1.1)   37 (1.1)   12 (0.5)   1 (0.2)   270 (1.1)
 Postsecondary
  Prison        264       149  15 (3.2)   30 (4.5)   37 (4.5)   15 (3.1)   3 (1.1)   276 (3.4)
  Household  12,143    80,426   5 (0.3)   17 (0.6)   38 (0.8)   31 (0.7)   8 (0.4)   310 (0.9)

n = sample size; WGT N = population size estimate /1,000 (the sample sizes for subpopulations may not add up to the total
sample sizes because of missing data); RPCT = row percentage estimate; PROF = average proficiency estimate; (SE) =
standard error of the estimate (the true population value can be said to be within 2 standard errors of the sample estimate with
95% certainty).
* Percentages less than 0.5 are rounded to zero.
Source: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey, 1992.

Results by Race/Ethnicity
On all three literacy scales, White prisoners demonstrate higher average
literacy skills than Black prisoners, who show greater proficiency than Hispanic
prisoners (table 2.5). The proficiency scores of White prisoners are, on average,
28 points higher than those of Black prisoners and 62 points higher than scores
of Hispanic prisoners on the prose scale; the scores of Black prisoners are
34 points higher than those of Hispanic prisoners. On the document scale, the
proficiency scores of White inmates are an average of 38 points higher than the
scores of Black inmates, and Black inmates’ scores are 34 points higher than
Hispanic inmates’ scores. On the quantitative scale, the proficiency scores of


TABLE 2.5
Percentages at Each Level and Average Proficiencies on Each Literacy Scale
of Prison and Household Populations, by Race/Ethnicity

                                        LEVELS AND AVERAGE PROFICIENCY
LITERACY SCALES                  Level 1       Level 2       Level 3       Level 4       Level 5        Average
BY RACE/ETHNICITY      WGT N   225 or lower   226 to 275    276 to 325    326 to 375   376 or higher  proficiency
BY POPULATIONS   n    (/1,000)  RPCT ( SE )   RPCT ( SE )   RPCT ( SE )   RPCT ( SE )   RPCT ( SE )   PROF ( SE )

Prose
 White
  Prison        417       266   17 ( 2.2)     35 ( 2.6)     36 ( 3.0)     11 ( 2.3)     1 ( 0.5)     270 ( 3.3)
  Household  16,875   144,702   14 ( 0.4)     25 ( 0.6)     36 ( 0.8)     21 ( 0.5)     4 ( 0.3)     286 ( 0.7)
 Black
  Prison        480       340   33 ( 2.4)     43 ( 3.4)     21 ( 2.5)      3 ( 0.8)     0†( 0.2)     242 ( 2.6)
  Household   4,483    20,852   38 ( 1.2)     37 ( 1.4)     21 ( 1.0)      4 ( 0.6)     0†( 0.2)     237 ( 1.4)
 Hispanic
  Prison        211       134   55 ( 3.3)     27 ( 3.5)     16 ( 3.8)      2 ( 1.1)     0†( 0.3)     208 ( 4.6)
  Household   2,915    18,347   49 ( 1.5)     26 ( 1.4)     19 ( 1.5)      6 ( 0.8)     1 ( 0.3)     215 ( 2.2)

Document
 White
  Prison        417       266   17 ( 2.6)     34 ( 2.9)     39 ( 2.4)     10 ( 2.2)     1 ( 0.4)     270 ( 3.4)
  Household  16,875   144,702   16 ( 0.5)     27 ( 0.6)     34 ( 0.7)     19 ( 0.5)     3 ( 0.2)     280 ( 0.8)
 Black
  Prison        480       340   39 ( 3.5)     44 ( 3.2)     16 ( 2.0)      1 ( 0.7)     0†( 0.0)     232 ( 3.2)
  Household   4,483    20,852   43 ( 1.0)     36 ( 1.2)     18 ( 0.9)      3 ( 0.4)     0†( 0.1)     230 ( 1.2)
 Hispanic
  Prison        211       134   53 ( 3.6)     29 ( 3.9)     16 ( 3.2)      2 ( 1.7)     0†( 0.4)     198 ( 4.9)
  Household   2,915    18,347   50 ( 1.7)     26 ( 1.6)     18 ( 1.4)      5 ( 0.8)     1 ( 0.3)     213 ( 2.5)

Quantitative
 White
  Prison        417       266   21 ( 2.5)     31 ( 3.7)     34 ( 3.1)     12 ( 2.4)     2 ( 0.7)     267 ( 3.4)
  Household  16,875   144,702   14 ( 0.5)     24 ( 0.6)     35 ( 0.7)     21 ( 0.4)     5 ( 0.2)     287 ( 0.8)
 Black
  Prison        480       340   49 ( 3.1)     34 ( 2.9)     15 ( 2.1)      2 ( 0.8)     0†( 0.1)     223 ( 4.5)
  Household   4,483    20,852   46 ( 1.0)     34 ( 1.1)     17 ( 1.0)      3 ( 0.4)     0†( 0.1)     224 ( 1.4)
 Hispanic
  Prison        211       134   57 ( 3.7)     26 ( 4.7)     14 ( 3.8)      3 ( 1.9)     0†( 0.6)     201 ( 6.1)
  Household   2,915    18,347   50 ( 1.3)     25 ( 1.3)     19 ( 1.3)      5 ( 1.1)     1 ( 0.2)     212 ( 2.5)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); RPCT = row percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the true
population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
† Percentages less than 0.5 are rounded to 0.

Source: Educational Testing Service, National Adult Literacy Survey, 1992.


White inmates are 44 points higher on average than those of Black inmates and
66 points higher than those of Hispanic inmates. These differences among
inmates mirror differences among adults in the household population, in which
the proficiency scores of White adults are 49 to 63 points higher than those of
Black adults and the scores of Black adults are 12 to 22 points higher than
those of Hispanic adults, on average.
White prisoners demonstrate lower proficiencies than White adults in
the household population on all three literacy scales. On average, their prose
proficiency is 16 points lower, their document proficiency 10 points lower,
and their quantitative proficiency 20 points lower than those of the White
household population. On the other hand, the proficiency scores of Black and
Hispanic prisoners tend to be about the same as those of their counterparts in
the household population. The exception occurs among Hispanic adults on the
document scale, where prisoners demonstrate lower average proficiency than
adults in households (198 compared with 213).
Results by Educational Attainment and Race/Ethnicity
Since both educational attainment and race are related to literacy proficiency,
we will examine them together to see if both factors act to depress the literacy
scores of the prison population as compared with the household population. As
was previously noted, the racial composition of the prison population differs
from that of the household population. While the prison population is 35
percent White, 44 percent Black, and 18 percent Hispanic, the household
population is 76 percent White, 11 percent Black, and 10 percent Hispanic.
Minority adults, both in prisons and households, have less education, on
average, than White adults. In the prison population, White inmates average
11.3 years of schooling, Black inmates 10.8, and Hispanic inmates 9.6. In the
household population, White adults average 12.8 years of schooling, Black
adults 11.6 years, and Hispanic adults 10.2 years (table 2.6).
Prisoners have less education on average than their racial or ethnic
counterparts in the household population. White prisoners have about one and
one-half years less education than White householders, Black prisoners have
almost a year less than Black householders, and Hispanic prisoners have about
one-half year less than Hispanic householders. Only about 2 in 10 White
householders have less than a high school diploma while almost 4 in 10 White
prisoners have not graduated from high school or received a GED. Over half of
the Black prisoners, compared with about one-third of Black adults in the
household population, have not completed high school. Sixty-one percent of
Hispanic inmates, compared with 46 percent of Hispanic householders, have
less than a high school diploma. While minorities are overrepresented in the


Table 2.6
Percentages of Prison and Household Populations
Attaining Each Education Level and Average Years
of Schooling, by Race/Ethnicity

                                           EDUCATION
RACE/                          0 to 12     High        Post-      Average years
ETHNICITY BY         WGT N     years       school/GED  secondary  of schooling
POPULATIONS    n    (/1,000)  RPCT (SE)   RPCT (SE)   RPCT (SE)      Mean

White
  Prison       417       266  38 (2.0)    38 (2.2)    25 (1.3)    11.3 (0.1)
  Household 16,298   139,011  19 (0.3)    33 (0.3)    48 (0.4)    12.8 (0.0)
Black
  Prison       479       339  55 (1.8)    29 (1.6)    16 (0.9)    10.8 (0.1)
  Household  4,252    19,496  35 (1.2)    33 (1.1)    33 (1.2)    11.6 (0.1)
Hispanic
  Prison       208       132  61 (3.3)    23 (2.7)    16 (1.7)     9.6 (0.2)
  Household  2,731    17,047  46 (1.2)    26 (1.1)    28 (1.1)    10.2 (0.1)

n = sample size; WGT N = population size estimate /1,000 (the sample sizes for subpopulations may not add up
to the total sample sizes because of missing data); RPCT = row percentage estimate; (SE) = standard error of
the estimate (the true population value can be said to be within 2 standard errors of the sample estimate
with 95% certainty).
Source: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey, 1992.

prison population compared with the household population and in general tend
to have less education than White adults, even within racial/ethnic groups,
prisoners have less education on average than their counterparts in the
household population.
Much of the difference in performance between those in prison and those
in the household population disappears when racial or ethnic background and
educational attainment are accounted for (table 2.7). In fact, White, Black, and
Hispanic inmates without a high school diploma or GED perform, on average,
better than their counterparts in the household population. The proficiency of
White inmates without a high school diploma is 15 points higher than that of
corresponding White adults in the household population on the prose scale,
23 points higher on the document scale, and 10 points higher on the
quantitative scale. The proficiency scores of Black prisoners who had not
finished high school are 30 points higher than those of Black householders with
the same educational background on the prose, document, and quantitative
scales. The scores of Hispanic inmates without a diploma are 25 points higher
than those of Hispanic householders who had not completed high school on
both the prose and quantitative scales, and 13 points higher on the document
scale, although this difference does not reach statistical significance.


White and Black prisoners who have received a high school diploma or a
GED demonstrate about the same literacy proficiency as their counterparts in
the household population with the same level of education. The proficiency
scores for both White prisoners and householders with a high school diploma
or GED average about 275 on the prose, document, and quantitative scales.
The proficiencies for Black prisoners and householders with a high school
education range between 233 and 242 on the document and quantitative scales,
with statistically insignificant differences between the two groups. On the prose
scale, however, Black prisoners with a high school education perform, on
average, 13 points higher than their household counterparts, a difference that
is statistically significant. (It is not possible to compare the Hispanic inmates
with the Hispanic householders as the inmate sample size is not large enough
to make reliable estimates.)
White inmates with any postsecondary education do not perform as well
as White householders on the prose and quantitative scales, while on the
document scale White inmates and householders with postsecondary education
perform about the same (around 305). Black inmates with educational
experience beyond high school perform about the same on all three scales as
Black householders with the same educational experience. (Once again, the
Hispanic inmate sample size is not large enough to make reliable estimates;
therefore, a comparison cannot be made with the household population.)
Thus, when compared by racial/ethnic background and educational
experience, the performance of prisoners with less than a high school
education is better than that of householders, while the performance of
prisoners with a high school diploma or more is generally similar to or, in a few
instances, lower than that of householders. The low literacy proficiencies of the
total prison population in comparison with the total household population are
primarily due to the large proportion of minority racial/ethnic groups and
persons with lower levels of education in the prison population. When these
variables are taken into account, prisoners score the same or better than their
racial or ethnic counterparts in the household population. One notable pattern,
however, is that for both prisoners and householders, differences remain
among the racial/ethnic groups. White adults at each educational level perform
better than Black and Hispanic adults on all three scales, and Black adults
perform better than Hispanic adults.


TABLE 2.7
Average Proficiencies on Each Literacy Scale of Prison and Household
Populations, by Race/Ethnicity and by Level of Education

                                         LEVEL OF EDUCATION
LITERACY SCALES                 0 to 12       High school/     Post-
BY RACE/ETHNICITY      WGT N     years           GED         secondary
BY POPULATIONS   n    (/1,000)  PROF ( SE )   PROF ( SE )   PROF ( SE )

Prose
 White
  Prison        417       266   242 ( 4.0)    276 ( 4.7)    304 ( 5.1)
  Household  16,298   139,011   227 ( 1.7)    278 ( 1.1)    317 ( 0.9)
 Black
  Prison        479       339   223 ( 3.2)    255 ( 4.0)    283 ( 6.5)
  Household   4,252    19,496   193 ( 2.7)    242 ( 1.6)    275 ( 1.7)
 Hispanic
  Prison        208       132   186 ( 5.9)    *** ( ****)   *** ( ****)
  Household   2,731    17,047   161 ( 3.4)    241 ( 3.8)    275 ( 3.0)

Document
 White
  Prison        417       266   243 ( 3.9)    276 ( 5.2)    303 ( 3.8)
  Household  16,298   139,011   220 ( 2.0)    271 ( 1.1)    310 ( 0.8)
 Black
  Prison        479       339   216 ( 4.3)    242 ( 4.0)    267 ( 4.8)
  Household   4,252    19,496   186 ( 2.2)    235 ( 1.5)    267 ( 1.9)
 Hispanic
  Prison        208       132   171 ( 7.1)    *** ( ****)   *** ( ****)
  Household   2,731    17,047   158 ( 3.7)    241 ( 4.3)    274 ( 2.8)

Quantitative
 White
  Prison        417       266   234 ( 4.6)    275 ( 4.5)    304 ( 4.9)
  Household  16,298   139,011   224 ( 2.1)    279 ( 1.1)    318 ( 0.9)
 Black
  Prison        479       339   206 ( 6.1)    238 ( 6.7)    257 ( 5.3)
  Household   4,252    19,496   176 ( 2.6)    233 ( 1.9)    266 ( 1.9)
 Hispanic
  Prison        208       132   180 ( 8.2)    *** ( ****)   *** ( ****)
  Household   2,731    17,047   155 ( 3.4)    240 ( 4.2)    276 ( 3.0)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); PROF = average proficiency estimate; (SE) = standard error of the estimate (the true population value can be said to
be within 2 standard errors of the sample estimate with 95% certainty).
*** Sample size is insufficient to permit a reliable estimate (fewer than 45 respondents).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.


Results by Sex
Male and female prisoners demonstrate about the same average proficiencies
on the literacy scales (table 2.8). They average about 250 on the prose scale,
240 on the document scale, and 235 on the quantitative scale. In contrast, in
the household population men perform slightly better than women on the
document and quantitative scales. About the same percentage of male and
female inmates perform in each level on each of the scales, with over
60 percent of each group performing in Levels 1 and 2.

TABLE 2.8
Percentages at Each Level and Average Proficiencies on Each Literacy Scale
of Prison and Household Populations, by Sex
LITERACY                                      LEVELS AND AVERAGE PROFICIENCY
SCALES BY SEX                       Level 1      Level 2      Level 3      Level 4      Level 5       Average
BY POPULATIONS     n       WGT N    225 or lower 226 to 275   276 to 325   326 to 375   376 or higher proficiency
                           (/1,000) RPCT ( SE )  RPCT ( SE )  RPCT ( SE )  RPCT ( SE )  RPCT ( SE )   PROF ( SE )

Prose
  Male
    Prison       1,076        723   31 ( 1.7)    37 ( 2.0)    25 ( 1.7)     6 ( 0.7)    0†( 0.3)      246 ( 1.9)
    Household   10,694     91,376   22 ( 0.6)    26 ( 0.9)    31 ( 1.2)    18 ( 0.6)    4 ( 0.3)      272 ( 0.9)
  Female
    Prison          71         43   27 ( 6.7)    35 ( 3.5)    33 ( 6.5)     5 ( 5.1)    0†( 0.0)      252 ( 7.6)
    Household   14,208     98,858   20 ( 0.5)    28 ( 0.7)    33 ( 0.7)    17 ( 0.5)    3 ( 0.2)      273 ( 0.8)
Document
  Male
    Prison       1,076        723   33 ( 2.2)    38 ( 2.2)    25 ( 1.5)     4 ( 1.0)    0†( 0.2)      240 ( 2.2)
    Household   10,694     91,376   23 ( 0.6)    26 ( 0.5)    31 ( 0.8)    17 ( 0.5)    3 ( 0.3)      269 ( 0.9)
  Female
    Prison          71         43   32 ( 6.8)    38 ( 6.8)    25 ( 8.1)     5 ( 2.2)    0†( 0.0)      244 ( 9.2)
    Household   14,208     98,858   23 ( 0.6)    30 ( 0.7)    31 ( 0.6)    14 ( 0.5)    2 ( 0.2)      265 ( 0.9)
Quantitative
  Male
    Prison       1,076        723   40 ( 1.9)    32 ( 2.3)    22 ( 2.0)     6 ( 1.1)    1 ( 0.4)      236 ( 3.1)
    Household   10,694     91,376   21 ( 0.7)    23 ( 0.5)    31 ( 0.6)    20 ( 0.4)    5 ( 0.3)      277 ( 0.9)
  Female
    Prison          71         43   43 ( 6.5)    32 ( 8.3)    21 ( 7.2)     4 ( 1.8)    1 ( 1.3)      234 ( 9.7)
    Household   14,208     98,858   23 ( 0.5)    28 ( 0.9)    31 ( 1.0)    15 ( 0.6)    3 ( 0.3)      266 ( 0.9)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); RPCT = row percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the true
population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
† Percentages less than 0.5 are rounded to 0.

Source: Educational Testing Service, National Adult Literacy Survey, 1992.

Male prisoners demonstrate lower literacy proficiencies than males in
the household population on all three scales. Furthermore, about 68 percent
of male prisoners, compared with 48 percent of males in the household
population, are in Levels 1 and 2 on the prose scale. Seven in 10 male
prisoners and 5 in 10 male householders perform in the two lowest levels
of the document scale. Almost three-quarters of male prisoners and slightly
under one-half of males in the household population are in Levels 1 and 2
on the quantitative scale.
Although higher percentages of female prisoners than females in the household
population perform in Level 1 on the prose and document scales, the
differences are not statistically significant. Because few women are
incarcerated, the prison sample contains only a small number of women. The
resulting large standard errors for the female prison population reflect
variability due to sampling as well as to measurement error, so it is often
impossible to tell whether an observed difference reflects sampling variability
or a true difference. On the quantitative scale, however, a significantly
higher percentage of women prisoners than female householders are in Level 1
(43 percent vs. 23 percent). When average proficiency scores are compared,
those of female prisoners are lower than those of female householders on all
three scales.
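The significance checks described here follow from the standard errors reported in the tables: a difference between two independent estimates is significant at roughly the 95 percent level when it exceeds about two standard errors of the difference. A brief sketch (the function is ours, and this normal approximation mirrors, but is not identical to, the survey's actual variance-estimation procedures):

```python
import math

def significant(est1, se1, est2, se2, z=2.0):
    """Rough two-estimate comparison: is the difference larger than
    z standard errors of the difference (assuming independence)?"""
    se_diff = math.sqrt(se1 ** 2 + se2 ** 2)
    return abs(est1 - est2) > z * se_diff

# Level 1, quantitative scale: 43% of female prisoners (SE 6.5) vs.
# 23% of female householders (SE 0.5) -- reported as significant.
print(significant(43, 6.5, 23, 0.5))  # True

# Level 1, prose scale: 27% (SE 6.7) vs. 20% (SE 0.5) -- not significant,
# reflecting the large standard errors for the small female prison sample.
print(significant(27, 6.7, 20, 0.5))  # False
```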

Results by Age
On the prose and quantitative scales, prisoners perform about the same, on
average, for all three age groups, 16 to 24, 25 to 34, and 35 and older (table
2.9). Although the prose proficiency of prisoners under 25 years of age is 252,
it is not statistically different from that of prisoners 35 and older, who average
241. On the document scale, the proficiency scores of the two groups of
prisoners under the age of 35 are significantly higher (251 and 243) than the
scores of those who are 35 or older (230).
Clearly, prisoners demonstrate lower average proficiencies than
householders in their age group on all three literacy scales. In addition, about
two-thirds of prisoners younger than 25 years of age perform in Levels 1 and 2
on the prose scale, while less than half of householders are in those levels. An
estimated 68 percent of prisoners age 25 to 34 perform in Levels 1 and 2 on
the prose scale; 41 percent of the household population age 25 to 34 are in
these levels on this scale. Almost 7 in 10 prisoners age 35 or older perform in
the lowest two levels on the prose scale, while only 5 in 10 of the household
population do. Patterns on the document and quantitative scales are similar to
those on the prose scale.

TABLE 2.9
Percentages at Each Level and Average Proficiencies on Each Literacy Scale
of Prison and Household Populations, by Age
LITERACY                                      LEVELS AND AVERAGE PROFICIENCY
SCALES BY AGE                       Level 1      Level 2      Level 3      Level 4      Level 5       Average
BY POPULATIONS     n       WGT N    225 or lower 226 to 275   276 to 325   326 to 375   376 or higher proficiency
                           (/1,000) RPCT ( SE )  RPCT ( SE )  RPCT ( SE )  RPCT ( SE )  RPCT ( SE )   PROF ( SE )

Prose
  16 to 24
    Prison         281        174   26 ( 3.1)    42 ( 4.6)    25 ( 4.2)     5 ( 2.1)    0†( 0.2)      252 ( 3.2)
    Household    4,300     34,764   15 ( 0.9)    31 ( 1.4)    37 ( 1.4)    16 ( 1.1)    2 ( 0.3)      278 ( 1.0)
  25 to 34
    Prison         474        316   31 ( 2.8)    37 ( 3.1)    26 ( 2.6)     5 ( 1.1)    0†( 0.3)      247 ( 3.1)
    Household    6,227     41,009   16 ( 0.7)    25 ( 1.0)    34 ( 0.8)    21 ( 1.0)    4 ( 0.4)      283 ( 1.3)
  35 and older
    Prison         389        273   35 ( 3.2)    34 ( 2.8)    25 ( 2.2)     6 ( 1.8)    1 ( 0.5)      241 ( 3.5)
    Household   14,408    114,712   24 ( 0.5)    26 ( 0.7)    30 ( 0.8)    16 ( 0.5)    3 ( 0.3)      267 ( 0.8)
Document
  16 to 24
    Prison         281        174   27 ( 3.2)    41 ( 5.1)    27 ( 4.4)     5 ( 2.0)    0†( 0.2)      251 ( 3.4)
    Household    4,300     34,764   14 ( 0.7)    30 ( 1.2)    38 ( 1.5)    16 ( 1.2)    2 ( 0.3)      279 ( 1.1)
  25 to 34
    Prison         474        316   32 ( 3.0)    37 ( 3.5)    26 ( 2.9)     4 ( 1.2)    0†( 0.2)      243 ( 3.4)
    Household    6,227     41,009   16 ( 0.7)    25 ( 0.7)    35 ( 0.8)    21 ( 0.9)    4 ( 0.3)      281 ( 1.2)
  35 and older
    Prison         389        273   38 ( 3.7)    36 ( 2.8)    22 ( 3.0)     5 ( 1.4)    0†( 0.3)      230 ( 4.2)
    Household   14,408    114,712   28 ( 0.7)    29 ( 0.6)    27 ( 0.5)    13 ( 0.6)    2 ( 0.2)      258 ( 1.0)
Quantitative
  16 to 24
    Prison         281        174   40 ( 3.6)    33 ( 3.3)    22 ( 4.4)     5 ( 1.4)    1 ( 1.2)      240 ( 4.3)
    Household    4,300     34,764   17 ( 0.9)    30 ( 1.1)    36 ( 1.0)    15 ( 0.9)    2 ( 0.4)      275 ( 1.1)
  25 to 34
    Prison         474        316   39 ( 2.9)    33 ( 3.7)    21 ( 2.5)     6 ( 1.2)    1 ( 0.4)      237 ( 3.5)
    Household    6,227     41,009   17 ( 0.7)    24 ( 0.7)    34 ( 0.8)    20 ( 0.8)    5 ( 0.5)      282 ( 1.1)
  35 and older
    Prison         389        273   41 ( 3.9)    30 ( 3.7)    22 ( 2.8)     6 ( 1.7)    1 ( 0.6)      231 ( 5.2)
    Household   14,408    114,712   25 ( 0.7)    24 ( 0.7)    29 ( 0.7)    17 ( 0.5)    4 ( 0.3)      267 ( 1.0)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); RPCT = row percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the true
population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
† Percentages less than 0.5 are rounded to 0.

Source: Educational Testing Service, National Adult Literacy Survey, 1992.

Disabilities
Both prisoners and householders responded to a series of questions asking
them whether they had any disabilities. Table 2.10 shows the percentages in
each population reporting whether or not they had any disabilities and their
proficiency scores on the three scales. A significantly higher percentage of the
prison population (36 percent) than the household population (26 percent)
reported having at least one disability. (The higher reported incidence of
disabilities may be due, in part, to prisoners being more aware of disabilities
than householders because of having been evaluated by the criminal justice
system.) Furthermore, the proficiency scores of those in the prison population
who reported any disabilities are significantly lower than the scores of their
household counterparts, except on the document scale.

TABLE 2.10
Percentages and Average Proficiencies on Each Literacy Scale
of Prison and Household Populations, by Disability
                                              LITERACY SCALES
NUMBER OF DISABILITIES                   Prose                     Document                  Quantitative
BY POPULATIONS        n       WGT N      CPCT ( SE )  PROF ( SE )  CPCT ( SE )  PROF ( SE )  CPCT ( SE )  PROF ( SE )
                              (/1,000)

No Disability
  Prison            1,136        766     63 ( 1.7)    253 ( 2.3)   63 ( 1.7)    247 ( 2.7)   63 ( 1.7)    244 ( 3.7)
  Household        24,832    190,524     74 ( 0.5)    283 ( 0.6)   74 ( 0.5)    278 ( 0.6)   74 ( 0.5)    283 ( 0.7)
One or More Disabilities
  Prison            1,136        766     36 ( 1.6)    233 ( 3.9)   36 ( 1.6)    227 ( 4.6)   36 ( 1.6)    220 ( 5.0)
  Household        24,832    190,524     26 ( 0.5)    243 ( 1.3)   26 ( 0.5)    235 ( 1.4)   26 ( 0.5)    239 ( 1.6)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); CPCT = column percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the
true population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

Table 2.11 shows the percentages of each population reporting various
disabilities and the average proficiency scores on each scale associated with
each disability. (The column percentages add up to more than 100 percent
because respondents could report more than one disability.) Significantly more
inmates than householders reported having a learning disability or a mental or
emotional condition. Almost four times as many inmates as householders
reported a learning disability, and three times as many reported a mental or
emotional condition. While there are no significant differences on the three
scales between the proficiency scores of inmates and householders with a
mental or emotional condition, there are for a learning disability, with the
inmates attaining significantly lower scores. The difference is particularly
pronounced on the quantitative scale where inmates’ proficiency is over
30 points lower than that of householders. Proficiency scores for inmates
reporting a learning disability are significantly lower on all three scales than
the scores of inmates with all other disabilities except visual impairment. With
the exception of those with a learning disability, inmates with particular
disabilities do not demonstrate lower literacy skills than their counterparts
in the household population with the same disabilities.

Regression Analyses
Of particular concern when comparing the prison and the household
populations is the difference in demographic composition of the two
populations. As noted earlier, the prison population has greater proportions
of males, minorities, younger adults, and adults with lower levels of education.
In addition, when the two populations are compared by some of these
characteristics, such as educational attainment and race/ethnicity, the prison
population demonstrates proficiencies that are comparable to or higher than
those of the household population. Thus, in order to account for the
differences in the demographic composition of the two populations, regression
analyses were run in which the variables of sex, race/ethnicity, age, and level
of education were held constant, with the outcome being proficiency on each
of the three scales.
The results of the regression for performance on the three scales are
presented in table 2.12. For the variable of race/ethnicity, the category of other
includes American Indian, Alaskan Native, Asian, Pacific Islander, and other.
Age was entered as a continuous variable, that is, the actual age of the
respondent. Education was entered as a scale variable with the following
breakdowns: 0 to 8 years, 9 to 12 years, GED certificate, high school diploma,
some college education, and college degree.
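The regression design described above can be sketched in code. The sketch below (in Python with NumPy) shows only the structure of the model, with an intercept, dummy contrasts for sex, race/ethnicity, and prison status, and continuous age and ordinal education; all data values are randomly generated placeholders and the variable names are ours, not the survey's. The published coefficients in table 2.12 come from the actual NALS data and its jackknife-based procedures, not from this sketch.

```python
import numpy as np

# Placeholder data illustrating the design described in the text.
rng = np.random.default_rng(12345)
n = 200
female = rng.integers(0, 2, n).astype(float)       # 1 = female (male is the reference)
race = rng.choice(["White", "Black", "Hispanic", "Other"], n)
black = (race == "Black").astype(float)            # 1-df contrasts against White
hispanic = (race == "Hispanic").astype(float)
other = (race == "Other").astype(float)
age = rng.integers(16, 70, n).astype(float)        # continuous age in years
education = rng.integers(1, 7, n).astype(float)    # 1 = 0-8 yrs ... 6 = college degree
prison = rng.integers(0, 2, n).astype(float)       # 1 = prison (household is the reference)
proficiency = rng.normal(270, 50, n)               # placeholder outcome

# Design matrix with an intercept column, matching the rows of table 2.12.
X = np.column_stack([np.ones(n), female, black, hispanic, other,
                     age, education, prison])
coef, *_ = np.linalg.lstsq(X, proficiency, rcond=None)
print(coef.shape)  # one coefficient per predictor, plus the intercept
```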

TABLE 2.11
Percentages and Average Proficiencies on Each Literacy Scale
of Prison and Household Populations, by Various Disabilities
                                              LITERACY SCALES
DISABILITIES BY                          Prose                     Document                  Quantitative
POPULATIONS           n       WGT N      CPCT ( SE )  PROF ( SE )  CPCT ( SE )  PROF ( SE )  CPCT ( SE )  PROF ( SE )
                              (/1,000)

No Disability
  Prison            1,136        766     63 ( 1.7)    253 ( 2.3)   63 ( 1.7)    247 ( 2.7)   63 ( 1.7)    244 ( 3.7)
  Household        24,832    190,524     74 ( 0.5)    283 ( 0.6)   74 ( 0.5)    278 ( 0.6)   74 ( 0.5)    283 ( 0.7)
General Condition
  Prison            1,136        766     14 ( 1.3)    230 ( 5.1)   14 ( 1.3)    225 ( 6.2)   14 ( 1.3)    215 ( 6.4)
  Household        24,832    190,524     12 ( 0.3)    227 ( 1.6)   12 ( 0.3)    219 ( 1.9)   12 ( 0.3)    220 ( 2.4)
Visual Impairment
  Prison            1,136        766      7 ( 0.7)    210 ( 8.3)    7 ( 0.7)    207 ( 9.7)    7 ( 0.7)    195 ( 8.9)
  Household        24,832    190,524      7 ( 0.2)    217 ( 2.4)    7 ( 0.2)    212 ( 2.6)    7 ( 0.2)    210 ( 2.7)
Hearing Impairment
  Prison            1,136        766      6 ( 0.6)    225 ( 9.0)    6 ( 0.6)    238 (10.2)    6 ( 0.6)    225 (10.4)
  Household        24,832    190,524      7 ( 0.3)    243 ( 2.6)    7 ( 0.3)    236 ( 2.8)    7 ( 0.3)    242 ( 3.6)
Learning Disability
  Prison            1,136        766     11 ( 1.0)    189 ( 6.2)   11 ( 1.0)    183 ( 6.9)   11 ( 1.0)    166 ( 8.2)
  Household        24,832    190,524      3 ( 0.1)    207 ( 3.8)    3 ( 0.1)    201 ( 4.1)    3 ( 0.1)    198 ( 4.3)
Mental/Emotional
  Prison            1,136        766      6 ( 0.9)    228 ( 8.5)    6 ( 0.9)    229 ( 8.7)    6 ( 0.9)    212 ( 9.6)
  Household        24,832    190,524      2 ( 0.1)    225 ( 4.9)    2 ( 0.1)    223 ( 4.7)    2 ( 0.1)    214 ( 5.8)
Physical Disability
  Prison            1,136        766      9 ( 1.0)    244 ( 7.4)    9 ( 1.0)    238 ( 9.2)    9 ( 1.0)    233 ( 8.8)
  Household        24,832    190,524      9 ( 0.3)    231 ( 1.8)    9 ( 0.3)    222 ( 2.1)    9 ( 0.3)    223 ( 2.4)
Long-Term Disability
  Prison            1,136        766      8 ( 0.9)    237 ( 7.2)    8 ( 0.9)    229 ( 8.9)    8 ( 0.9)    225 (10.8)
  Household        24,832    190,524      8 ( 0.2)    236 ( 2.4)    8 ( 0.2)    225 ( 2.3)    8 ( 0.2)    227 ( 2.7)
Other Disability
  Prison            1,136        766      6 ( 0.8)    239 ( 8.4)    6 ( 0.8)    226 (10.3)    6 ( 0.8)    226 (11.1)
  Household        24,832    190,524      6 ( 0.3)    237 ( 2.6)    6 ( 0.3)    226 ( 2.4)    6 ( 0.3)    232 ( 3.2)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); CPCT = column percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the
true population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

TABLE 2.12
Results of Multiple Regression Analyses

SCALE AND
INDEPENDENT            Regression      Adjusted            T
VARIABLES*             Coefficient     Standard Error      Statistic     Probability

Prose
  Intercept               221.4            2.3                96.3          0.000
  Female                    5.1            1.0                 5.2          0.000
  Black                   -39.3            1.6               -25.2          0.000
  Hispanic                -54.4            1.5               -35.3          0.000
  Other                   -46.9            2.6               -18.0          0.000
  Age                      -0.7            0.0               -24.7          0.000
  Education                22.3            0.3                65.4          0.000
  Prison                   10.3            6.7                 1.6          0.121
  ___________________
  Multiple Correlation: 0.714

Document
  Intercept               234.3            2.2               104.7          0.000
  Female                    0.2            0.9                 0.3          0.792
  Black                   -41.5            1.5               -28.5          0.000
  Hispanic                -53.0            1.6               -32.4          0.000
  Other                   -39.4            2.8               -13.9          0.000
  Age                      -1.0            0.0               -34.8          0.000
  Education                21.3            0.3                65.4          0.000
  Prison                    4.8            6.6                 0.7          0.465
  ___________________
  Multiple Correlation: 0.717

Quantitative
  Intercept               223.2            2.1               103.1          0.000
  Female                   -7.2            1.0                -7.4          0.000
  Black                   -50.6            1.7               -30.7          0.000
  Hispanic                -56.5            1.9               -29.2          0.000
  Other                   -38.5            2.9               -13.4          0.000
  Age                      -0.7            0.0               -21.4          0.000
  Education                23.2            0.3                75.7          0.000
  Prison                    0.9            7.3                 0.1          0.907
  ___________________
  Multiple Correlation: 0.702

* Female, Black, Hispanic, other, and prison are 1 degree of freedom contrasts. Males, White, and household are the
comparison groups.
Source: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey, 1992.

When these variables are held constant, there is no statistically significant
difference in performance between the prison and household populations on any
of the three scales. The characteristic with the largest effect on
performance is level of education, followed by race/ethnicity. Thus, when
comparisons are made between the prison and household populations, it is
important to remember that differences in overall performance are most
likely attributable to differences in the demographic composition of the two
populations. On the other hand, the differences in demographics are important
and should not minimize the significance of the overall low performance of the
prison population, which comprises many individuals who demonstrate the
need for improved literacy skills.

Summary
The demographic composition and educational attainment of the prison
population differ significantly from that of the household population, with the
prison population more likely to be male, minority, young, and less educated.
Demonstrated performance on the three literacy scales also differs
significantly. The proficiency scores of the total prison population are some
27 points lower than those of the household population on the prose and
document scales, and 35 points lower on the quantitative scale. This lower
performance among inmates is also evident when the percentages of the two
populations performing in each of the levels are compared. Significantly more
inmates than householders perform in Levels 1 and 2. At the two higher levels,
the trend is reversed: significantly more householders than inmates perform in
Levels 4 and 5.
When the literacy survey results are compared for individuals with varying
levels of education, a strong relationship between education and literacy is
apparent. Both inmates and householders who had earned high school
diplomas demonstrate significantly higher average prose, document, and
quantitative proficiencies than do those who did not complete high school,
and individuals who had completed at least some college perform better, on
average, than those with high school diplomas. When the prison and household
populations are compared by level of education, however, there are some
differences in performance. The proficiency scores of prisoners who have a
high school diploma are significantly lower on all three scales than the
proficiencies of householders with a high school diploma. On the other hand,
inmates with a GED demonstrate about the same proficiencies as householders
with a GED.

Differences in performance are also evident across the racial/ethnic
groups studied within the prison population and when the groups are
compared across the prison and household populations. The average prose,
document, and quantitative proficiencies of White inmates are significantly
higher than those of Black, and the proficiencies of Black inmates are
significantly higher than those of Hispanic inmates. When racial/ethnic groups
are compared across the prison and household populations, the proficiency
scores of White inmates are significantly lower on all three scales than those of
White householders. Black and Hispanic inmates, however, demonstrate about
the same proficiencies as their counterparts in the household population.
The differences in performance between the prison and household
populations may be explained in part by differences in educational attainment
and racial/ethnic composition. When compared by these two variables, inmates’
literacy proficiencies are similar to or better than those of the householders.
Thus, the lower literacy proficiencies of the total prison population in
comparison with the household population may be related to the larger
proportion of inmates from minority racial/ethnic groups and of inmates with
lower levels of education. It is important to note, however, that these data do
not imply that all minority group members score at the lower levels on all three
scales, or that race/ethnicity itself causes the lower performance.
No differences in proficiencies are apparent between male and female
inmates. When compared with the household population, both male and
female prisoners demonstrate lower proficiencies than their counterparts on all
three scales. Similarly, inmates in all three age groups demonstrate about the
same proficiencies on the prose and quantitative scales, but inmates under
age 35 demonstrate higher document proficiency than those 35 and older.
When inmates within an age group are compared with their counterparts
in the household population, inmates demonstrate lower proficiencies.
Significantly more inmates than householders reported having at least one
disability, with a greater percentage of inmates than householders reporting a
learning disability or a mental or emotional condition. The proficiency scores of
inmates with a learning disability are significantly lower than those of the
householders reporting a similar disability.

CHAPTER 3
Experiences Before Prison

The personal histories of all adults have much to tell us about the influence of
environmental factors upon the chances of success and failure in later life. The
detailed information collected on prisoners in this and other surveys — such as
those conducted by the Bureau of Justice Statistics — can help illuminate the
paths connecting background experiences with literacy proficiency. Among the
most important factors to consider in examining such connections are
educational experiences, home environment, and occupation and income
before incarceration.

Educational Experiences
As noted in Chapter 2, a greater percentage of the prison population than of
the household population has lower levels of education. This
explains, in part, the differences in literacy proficiency across the three scales.
As is true for previous ETS adult literacy assessments, educational attainment
is the strongest predictor of literacy proficiency: the more formal education one
has, the higher one tends to perform on all three scales. Another facet of
educational experiences that will be looked at in relation to literacy proficiency
is the reason given by prisoners for dropping out of school.
Educational Attainment
As indicated in table 3.1, about one-half of the prisoners have not earned a
high school diploma, with the largest percentage (35 percent) having
completed 9 to 12 years of formal schooling. About the same percentage of
inmates received a high school diploma (14 percent) or GED (17 percent).
One-fifth of the inmates have some postsecondary education.
In general, the higher the educational attainment the higher the average
proficiency on the three scales. Those with 9 to 12 years of education
outperform those with 0 to 8 years of schooling by about 35 points, on average,
on the prose and quantitative scales and by almost 55 points, on average, on the
document scale. Prisoners who have completed a high school diploma or a
GED attain higher average proficiency scores on all three scales than do
those with 9 to 12 years of schooling. Inmates with a GED demonstrate
about the same proficiencies as inmates with a high school diploma. Prisoners
with at least some postsecondary education outperform those with a high
school diploma on all three scales but outperform inmates with a GED only on
the document scale.
Reason for Dropping Out of School
Those inmates who reported that they had left school before receiving a high
school diploma or who reported receiving a GED were asked for the main
reason they stopped their schooling when they did. The options that were
available were those used in the NAEP young adult literacy survey. (See
appendix D for a full description of the options.) As also shown in table 3.1, the
most prevalent reason given by prisoners (in about one-third of the cases) for
dropping out of school was loss of interest or academic difficulty. The average
proficiency scores of prisoners citing this reason, as well as of those who
dropped out because they were convicted of a crime, are significantly higher on
the prose and document scales than the scores of the 10 percent who reported
dropping out for financial reasons. No significant differences in quantitative
literacy are found when comparing these same groups.

Home Environment
Several questions were asked about the inmate’s home background. Two
influences — level of parental education and language spoken in the home
while growing up — are discussed below.
Level of Parental Education
Previous work investigating the intergenerational nature of literacy has
revealed the major role that parents’ educational attainment plays in their
children’s success in school. The results of other literacy assessments have
demonstrated that the educational attainment of parents acts as a significant
predictor of an individual’s literacy performance.1 Accordingly, both inmates

1

I.S. Kirsch and A. Jungeblut. (September 1992). Profiling the Literacy Proficiencies of JTPA and ES/UI
Populations: Final Report to the Department of Labor. Princeton, NJ: Educational Testing Service. I.S.
Kirsch and A. Jungeblut. (1986). Literacy: Profiles of America’s Young Adults. Princeton, NJ: Educational
Testing Service.

and householders were asked to indicate the highest level of education that
each of their parents had completed, and the highest level of education
attained by either parent was used as the parental education level.
The prison population differs from the household population in that a greater
percentage of prisoners than householders attained lower levels of education
than their parents — 39 compared with 21 percent (table 3.2). On the other
hand, a lower percentage of prisoners than householders attained higher levels
of education than their parents — 30 and 43 percent, respectively. As can be
seen in table 3.3, a greater percentage of prisoners than their parents have less
than a high school diploma — 49 and 36 percent, respectively. In contrast, a
lower percentage of householders than their parents have less than a high
school diploma — 24 and 32 percent, respectively.
Not only do prisoners attain lower levels of education than their parents
overall, but a greater percentage of prisoners’ parents attained lower levels of

TABLE 3.2
Percentages of Inmates and Householders Reporting Lower, Equal,
and Higher Levels of Education Than Their Parents
                                        COMPARISON
POPULATION          n       WGT N       Lower than    Same as       Higher than
                            (/1,000)    parents       parents       parents
                                        RPCT ( SE )   RPCT ( SE )   RPCT ( SE )

Inmates            885         581      39 ( 1.5)     31 ( 1.5)     30 ( 1.5)
Householders    21,719     163,141      21 ( 0.4)     35 ( 0.4)     43 ( 0.4)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); RPCT = row percentage estimate; (SE) = standard error of the estimate (the true population value can be said to be
within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

TABLE 3.3
Percentages of Inmates and Householders Reporting
Personal Education Level and Parental Education Level
                              POPULATION
EDUCATION LEVEL          Inmates       Inmates'       Householders   Householders'
                                       parents                       parents
                         CPCT ( SE )   CPCT ( SE )    CPCT ( SE )    CPCT ( SE )

0 to 12 years            49 ( 1.1)     36 ( 1.5)      24 ( 0.2)      32 ( 0.5)
H.S. diploma or GED      31 ( 1.0)     39 ( 1.6)      32 ( 0.1)      32 ( 0.4)
Postsecondary            20 ( 0.7)     25 ( 1.4)      44 ( 0.2)      36 ( 0.4)

CPCT = column percentage estimate; (SE) = standard error of the estimate (the true population value can be said to be within 2 standard
errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

education than householders’ parents. Thirty-six percent of prisoners’
parents have less than a high school diploma, compared with 32 percent of
householders’ parents. On the other hand, 25 percent of prisoners’ parents
have any college education, compared with 36 percent of householders’ parents.
The association of higher literacy proficiency with increased level of
parental education is evident from table 3.4. The connection between higher
proficiency scores and higher levels of parental education is not quite as strong,
however, for the prison population as it is for the household population. For
both the prison and household populations, those whose parents had 9 to
12 years of education attain higher scores on all three scales than those whose
parents had 0 to 8 years of education. While for the household population, the
scores of those whose parents had a high school diploma or GED are
significantly higher than the scores of those whose parents had 9 to 12 years
on all three scales, this is true for the prison population only on the document
scale. For the prison population, the proficiencies of those whose parents had
at least some postsecondary education are higher on the prose and document
scales than the proficiencies of those whose parents had a high school diploma
or GED; for the household population, the proficiencies are higher for the
postsecondary level on all three scales.

TABLE 3.4
Average Proficiencies on Each Literacy Scale
of Prison and Household Populations, by Level of Parental Education
PARENTAL                                        LITERACY SCALES
EDUCATION LEVEL
BY POPULATIONS        n       WGT N      Prose          Document       Quantitative
                              (/1,000)   PROF ( SE )    PROF ( SE )    PROF ( SE )

0 to 8 Years
  Prison            179         120      222 ( 5.4)     211 ( 5.9)     211 ( 6.9)
  Household       4,727      38,429      233 ( 1.5)     225 ( 1.6)     233 ( 1.7)
9 to 12 Years
  Prison            131          90      250 ( 4.7)     242 ( 4.8)     237 ( 7.1)
  Household       2,253      16,417      264 ( 1.7)     258 ( 1.7)     264 ( 2.0)
High School/GED
  Prison            342         226      259 ( 2.2)     256 ( 2.8)     249 ( 3.4)
  Household       7,491      55,289      283 ( 0.8)     279 ( 0.8)     284 ( 0.9)
Postsecondary
  Prison            236         147      271 ( 3.7)     270 ( 3.6)     261 ( 4.5)
  Household       8,226      61,469      307 ( 0.8)     303 ( 0.8)     305 ( 1.1)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); PROF = average proficiency estimate; (SE) = standard error of the estimate (the true population value can be said to
be within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

By level of parental education, the household population demonstrates
higher proficiency than does the prison population. For example, as shown in
table 3.4, the proficiencies of householders whose parents have a high school
diploma or GED are 23 to 35 points higher on the three scales than the
proficiencies of prisoners whose parents have the same level of education —
24 points higher on the prose scale, 23 points higher on the document, and
35 on the quantitative. Such differences may be attributable, in part, to the
tendency of the inmate population to have lower levels of education than both
their parents and householders. In some cases when the level of education for
inmates, householders, and parents is the same, such differences disappear.
Table 3.5 shows the average proficiencies across the three scales for prisoners
and householders who have the same level of education as each other and as
their parents. Thus, for prisoners and householders whose parents had 0 to

TABLE 3.5
Average Proficiencies on Each Literacy Scale of Inmates and Householders
Reporting Same Level of Education as Their Parents
                                                LITERACY SCALES
EDUCATION LEVEL
BY POPULATIONS        n       WGT N      Prose          Document       Quantitative
                              (/1,000)   PROF ( SE )    PROF ( SE )    PROF ( SE )

0 to 12 Years
  Inmates           158         111      210 ( 4.8)     199 ( 5.8)     198 ( 7.1)
  Householders    2,435      21,056      200 ( 1.9)     194 ( 2.2)     197 ( 2.4)
H.S. Diploma/GED
  Inmates           120          83      267 ( 3.4)     262 ( 4.7)     257 ( 5.5)
  Householders    2,593      21,440      275 ( 1.5)     271 ( 1.5)     277 ( 1.6)
Postsecondary
  Inmates            98          54      299 ( 6.9)     292 ( 6.1)     285 ( 7.5)
  Householders    5,972      39,974      322 ( 1.0)     317 ( 0.9)     320 ( 1.3)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); PROF = average proficiency estimate; (SE) = standard error of the estimate (the true population value can be said to
be within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

12 years of education and who themselves have 0 to 12 years, the proficiencies
of prisoners and householders are about the same on all three scales. The same
is true for prisoners and householders with a high school diploma or GED for
the document scale.
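The comparisons above follow a rule of thumb for independent estimates: a gap between two proficiencies is treated as meaningful when it exceeds roughly twice the standard error of the difference, computed as the square root of the sum of the squared standard errors. The sketch below applies this rule to values from table 3.5; the helper function is illustrative only, and the survey's published procedures may differ (for example, by adjusting for multiple comparisons):

```python
import math

def significantly_different(est1, se1, est2, se2):
    """Is the gap larger than 2 standard errors of the difference?"""
    return abs(est1 - est2) > 2 * math.sqrt(se1 ** 2 + se2 ** 2)

# Table 3.5, H.S. diploma/GED inmates vs. householders:
print(significantly_different(257, 5.5, 277, 1.6))  # quantitative: True (gap of 20)
print(significantly_different(262, 4.7, 271, 1.5))  # document: False (gap of 9)
```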

Language Background
Inmates were asked several questions about their language background.
Among other things, they were asked to indicate the language or languages usually
spoken in their home when they were growing up. As shown in table 3.6,
80 percent of the inmates grew up in homes where English only was spoken,
while about 9 percent grew up in homes where English and another language
were spoken and 11 percent in homes where only a language other than
English was spoken. The relationship of language in the homes of prisoners
to literacy proficiency is clear. Those who lived in English-only or English-bilingual households attain similar proficiency scores but outperform by 65 to

TABLE 3.6
Percentages and Average Proficiencies on Each Literacy Scale
of Inmates, by Language Spoken in the Home

                                                  LITERACY SCALES
                                      WGT N                    Prose        Document     Quantitative
LANGUAGE                  n           (/1,000)    CPCT ( SE )  PROF ( SE )  PROF ( SE )  PROF ( SE )

English                   907         616          80 ( 0.9)   253 ( 2.2)   248 ( 2.5)   242 ( 3.2)
English and other         112         68            9 ( 0.9)   259 ( 4.4)   256 ( 5.5)   245 ( 5.6)
Other                     127         81           11 ( 0.7)   180 ( 6.6)   165 ( 6.7)   178 ( 8.5)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); CPCT = column percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the
true population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

90 points those who grew up in homes where English was not spoken. For
example, on the prose scale those who grew up in homes where English only or
English and another language were spoken average about 255, whereas those
from non-English speaking homes average 180. These inmates’ average
proficiencies on all three scales indicate that they demonstrate skills associated
with only the most rudimentary tasks.

Occupation and Income
The employment histories of prisoners may refute the popular conception that
a life of crime is an occupation in itself. Two-thirds of prisoners reported in the
Bureau of Justice Statistics survey that they were working in the month prior
to being arrested for their current offense.2 For the National Adult Literacy
Survey, inmates who had been admitted to prison after December 1988 were
asked if they had been working and, if so, what their occupation was. As shown
in table 3.7, the vast majority (85 percent) of prisoners who were working prior
2 A. Beck, et al. (March 1993). Survey of State Prison Inmates, 1991. Washington, DC: U.S. Department of Justice, p. 3.

TABLE 3.7
Percentages and Average Proficiencies on Each Literacy Scale of Inmates,
by Occupation Category and Income Before Incarceration

                                                  LITERACY SCALES
                                      WGT N                    Prose        Document     Quantitative
OCCUPATION AND INCOME     n           (/1,000)    CPCT ( SE )  PROF ( SE )  PROF ( SE )  PROF ( SE )

Total Population
  Total                   661         435         100 ( 0.0)   248 ( 2.5)   246 ( 3.2)   240 ( 4.0)

Occupation Category
  Professional            34          20            4 ( 0.7)   *** ( ****)  *** ( ****)  *** ( ****)
  Sales or
    administrative support 74         44           10 ( 1.4)   263 ( 9.0)!  260 (11.7)!  258 (12.3)!
  Craft or service        288         191          44 ( 2.3)   251 ( 4.2)   249 ( 4.3)   244 ( 5.6)
  Assemblers, laborers,
    farm, or transportation 264       180          41 ( 2.1)   239 ( 3.9)   237 ( 4.5)   226 ( 5.9)

Monthly Income
  $0 to $499              143         94           22 ( 2.0)   241 ( 5.0)   242 ( 5.2)   227 ( 6.3)
  $500 to $999            202         134          31 ( 2.2)   241 ( 5.5)   238 ( 6.0)   232 ( 6.8)
  $1,000 to $1,499        159         106          25 ( 1.9)   254 ( 4.7)   251 ( 6.0)   249 ( 7.1)
  $1,500 or more          147         95           22 ( 1.8)   262 ( 5.3)   258 ( 5.9)   255 ( 5.8)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); CPCT = column percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the
true population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
*** Sample size is insufficient to permit a reliable estimate (fewer than 45 respondents).
! Interpret with caution -- the nature of the sample does not allow accurate determination of the variability of this statistic.
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

to incarceration held jobs in craft, service, assembly, labor, or transportation
categories. Prisoners who held jobs in sales or administrative support
demonstrate higher proficiency than those who held jobs in assembly, labor,
farm, or transportation occupations only on the prose scale. Inmates in each
category perform, on average, in Level 2 on the literacy scales.
Prisoners who had been admitted after December 1988 were also asked
what their average monthly earnings were for the job they held. About half of
the prisoners reported earning less than $1,000 per month or $12,000 a year
(table 3.7). The proficiency scores of those who reported earning $1,500 per
month or more are significantly higher on both the prose and quantitative
scales than the scores of those who reported less than $1,000. The differences
in document proficiency are not significant between any of the income groups.

Summary
The National Adult Literacy Survey results reveal that certain background
factors are related to literacy proficiency. Generally the higher the level of
education, the higher the proficiency scores on the three scales. The most
prevalent reason for dropping out of school was loss of interest or academic
difficulty. Inmates who left school for this reason demonstrate higher
proficiencies than inmates who dropped out of school for financial reasons.
Level of parental education has a similar relationship to proficiency as
does the inmates’ own level of education: generally, the higher the level of
parental education, the higher the inmates’ proficiencies. The effects of low
levels of parental education can be compensated for, however, if individuals
attain higher levels of education than their parents. Prisoners, however, are
relatively disadvantaged in that they attain lower levels of education than their
parents and their parents have attained less formal education than is typical of
parents in the household population.
When inmates come from homes where only a non-English language was
spoken, their literacy proficiencies are significantly lower than those who come
from homes where English was spoken.
The relationship of two other factors — occupation and income — to
literacy was also explored. Few inmates held professional jobs before incarceration.
The only significant difference in proficiency occurs on the prose scale where
those in sales and administrative support are compared with those in the
assembly, labor, farm, or transportation occupations. The proficiencies of
inmates are not significantly different when compared by other occupational
categories. Inmates who earned $1,500 or more per month demonstrated
higher prose and quantitative proficiencies than those who earned less than
$1,000 per month.

CHAPTER 4
Experiences Unique to Prison Life

Popular conceptions of life behind prison walls may be highly influenced by
the extent to which prisons are seen as places of punishment or rehabilitation.
In either case, until recently little was known about how prisoners
spend their time on a daily basis. Through information gathered from the
National Adult Literacy Survey, the Bureau of Justice Statistics (BJS) survey of
state prisoners, and the General Accounting Office (GAO) report on federal
prisoners, a much clearer picture emerges of the diverse activities in which
prisoners participate. Many of these activities in education and vocational
training may have implications for reducing recidivism and preparing prisoners
to rejoin the general population. Other activities — including participation in
social organizations and work duties — provide possibilities for self-improvement. Such opportunities also provide specific contexts for the
development and reinforcement of the types of literacy skills profiled in this
survey. This chapter explores the relationship between literacy proficiency and
prison experiences as well as type of offense and length of sentence.
The BJS 1991 survey of 13,986 prisoners (representing 711,000 prisoners
in state correctional facilities) revealed that “nearly all inmates had participated
in work, education, or other programs since their admission to prison.”1 Around
80 percent reported that they were currently participating in a program or
activity. About half reported having received academic education and one-third,
vocational training, since entering prison. The GAO survey of 2,925 federal
prisoners (representative of around 65,000 federal prisoners in 1992) asked
prisoners to rate factors accounting for their participation in education and
vocational programs. Over 70 percent rated opportunity for self-improvement
as the number one reason for their participation in educational programs.2
The second most highly rated reason (at around 60 percent) was to obtain
1 A. Beck, et al. (March 1993). Survey of State Prison Inmates, 1991. Washington, DC: U.S. Department of Justice, p. 27.
2 H.A. Valentine. (January 1993). Federal Prisons: Inmate and Staff Views on Education and Work Training Programs. Report to the Chairman, Select Committee on Narcotics Abuse and Control, House of Representatives, p. 9.

marketable skills. Two other reasons also highly rated by 40 to 50 percent of
respondents were that participation in educational or vocational classes was a
way to reduce chances of returning to prison and that such courses were a
challenge. The lowest-rated reason for taking educational or vocational
programs was boredom or filling time (about 15 percent).

Literacy Proficiency by Type of Offense
Prisoners were asked to indicate for what offenses they were currently in
prison. If they indicated more than one, they were asked for which offense they
received the longest sentence, and that offense became the reported offense.
As shown in table 4.1, the greatest percentage of inmates, 44 percent, reported
that they were serving time for violent crimes, which include homicide, rape,

TABLE 4.1
Percentages and Average Proficiencies on Each Literacy Scale
of Inmates, by Current Offense

                                                  LITERACY SCALES
                                      WGT N                    Prose        Document     Quantitative
CURRENT OFFENSE           n           (/1,000)    CPCT ( SE )  PROF ( SE )  PROF ( SE )  PROF ( SE )

Total Population
  Total                   1,106       738         100 ( 0.0)   246 ( 1.9)   240 ( 2.1)   236 ( 2.9)

Current Offense
  Violent                 480         322          44 ( 1.9)   246 ( 3.1)   240 ( 3.4)   235 ( 4.7)
  Property                202         133          18 ( 1.2)   259 ( 4.1)   256 ( 3.7)   246 ( 4.5)
  Drugs                   287         189          26 ( 1.5)   237 ( 4.9)   230 ( 5.9)   233 ( 6.6)
  Public order            137         93           13 ( 1.1)   246 ( 4.3)   240 ( 5.4)   235 ( 6.0)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); CPCT = column percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the
true population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

sexual assault, robbery, kidnapping, and assault. Those convicted of drug
offenses make up the second largest group at 26 percent of the prison
population, followed by property offenses (18 percent), which include such
crimes as burglary, larceny, auto theft, fraud, embezzlement, arson, and stolen
property. Those convicted of public order offenses, which include weapons
offense, rioting, contempt of court, morals/decency offense, probation and
parole violations, and minor traffic violations compose the smallest group,
13 percent. Drug offenders demonstrate lower prose and document
proficiencies than property offenders, but about the same quantitative
proficiency. Violent offenders also demonstrate lower proficiencies than
property offenders on the document scale. The proficiencies of other types
of offenders are not statistically different from one another.

Literacy and Length of Prison Sentence
As shown in table 4.2, over one-half of the prisoners were sentenced to prison
for five years (60 months) or less, while 9 percent do not expect to be released.
There are no significant differences in average proficiency scores of inmates
when they are compared with respect to length of sentence, with the exception
that the scores of those with a sentence of five years or less are higher on the
prose scale than the scores of those with a sentence of more than 10 years, as
well as higher on the document scale than the scores of those who do not
expect to be released.

Participation in Educational and Vocational Programs
Prisoners were asked about their participation in educational and vocational
training programs while in prison. As shown in table 4.3, almost two-thirds of
the prisoners have engaged in either educational and/or vocational training
programs since incarceration for their current offense. Nevertheless, fewer
prisoners (13 percent) participate only in vocational classes than in either
education (30 percent) or both types of classes (20 percent). On the prose
scale, the average proficiency of those involved only in vocational training is
significantly higher than the proficiency of those who participate in both
vocational and education programs (265 and 239, respectively). The vocational-only group also performs significantly higher (265) on the prose scale than
those who participate in no classes at all (246) and those involved only in
education classes (242). For the document scale, prisoners only in vocational
classes attain higher proficiency than prisoners in both. The quantitative
proficiency of inmates in vocational-only training is higher than the

TABLE 4.2
Percentages and Average Proficiencies on Each Literacy Scale
of Inmates, by Length of Sentence

                                                  LITERACY SCALES
                                      WGT N                    Prose        Document     Quantitative
LENGTH OF SENTENCE        n           (/1,000)    CPCT ( SE )  PROF ( SE )  PROF ( SE )  PROF ( SE )

Total Population
  Total                   1,146       765         100 ( 0.0)   246 ( 1.9)   240 ( 2.2)   236 ( 3.1)

Sentence in Months
  0 to 60                 675         442          58 ( 1.9)   251 ( 2.2)   247 ( 2.1)   240 ( 2.8)
  61 to 120               191         128          17 ( 1.3)   247 ( 4.9)   240 ( 5.4)   240 ( 7.2)
  121 or more             115         81           11 ( 1.0)   232 ( 5.7)   231 ( 6.0)   225 ( 7.3)
  Do not expect to
    be released           97          66            9 ( 0.9)   236 ( 6.4)   224 ( 7.4)   221 ( 8.0)
  Don't know              68          48            6 ( 0.9)   233 ( 9.1)   218 (12.9)   224 (14.8)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); CPCT = column percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the
true population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

proficiencies of those in both kinds of programs and education only, but not
higher than the proficiency of those not enrolled in any program. Although
those who participate in vocational programs generally demonstrate higher
proficiencies, they still are performing, on average, only in Level 2.

Prison Work Experiences
Prisoners were also asked whether they were currently involved in work
assignments either inside or outside the prison facility. As shown in table 4.4,
69 percent of the inmates reported being assigned to work duties. The
proficiency scores of those who work in prison are significantly higher on all

TABLE 4.3
Percentages and Average Proficiencies on Each Literacy Scale
of Inmates, by Participation in Education and/or Vocational Programs

                                                  LITERACY SCALES
                                      WGT N                    Prose        Document     Quantitative
PROGRAM                   n           (/1,000)    CPCT ( SE )  PROF ( SE )  PROF ( SE )  PROF ( SE )

Total Population
  Total                   1,144       763         100 ( 0.0)   246 ( 1.9)   240 ( 2.2)   236 ( 3.0)

Program
  No participation
    in either             425         281          37 ( 1.7)   246 ( 3.1)   240 ( 3.8)   239 ( 4.1)
  Education classes only  340         227          30 ( 1.6)   242 ( 4.6)   237 ( 5.2)   230 ( 6.7)
  Vocational classes only 150         102          13 ( 1.2)   265 ( 4.3)   254 ( 4.3)   253 ( 5.4)
  Both kinds of classes   229         152          20 ( 1.4)   239 ( 4.0)   235 ( 4.2)   226 ( 4.5)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); CPCT = column percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the
true population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

three literacy scales than the scores of those who do not work. Those who work
score 11, 12, and 18 points higher, on average, on the prose, document, and
quantitative scales, respectively.
Inmates were asked to indicate not only whether they were working in
prison, but also the types of work in which they were engaged. Several areas of
work were listed, and respondents were asked to indicate all those in which
they were involved. These work assignments included goods production,
janitorial, grounds keeping, food preparation, other services (library, store,
office help, recreation), maintenance, and other unspecified jobs. Aside from
the other unspecified work, janitorial work was the most frequently reported
work assignment (although the difference between janitorial and food
preparation did not reach statistical significance). Five percent said that they
were enrolled in school as a work assignment. There appears to be no

TABLE 4.4
Percentages and Average Proficiencies on Each Literacy Scale
of Inmates, by Work Experience in Prison

                                                  LITERACY SCALES
                                      WGT N                    Prose        Document     Quantitative
WORK EXPERIENCE           n           (/1,000)    CPCT ( SE )  PROF ( SE )  PROF ( SE )  PROF ( SE )

Total Population
  Total                   1,146       765         100 ( 0.0)   246 ( 1.9)   240 ( 2.2)   236 ( 3.0)

Do You Work?
  Yes                     795         526          69 ( 2.5)   249 ( 2.2)   244 ( 2.3)   241 ( 2.9)
  No                      351         239          31 ( 2.5)   238 ( 4.0)   232 ( 4.7)   223 ( 5.0)

Work Assignments
  Do not work             351         239          31 ( 2.5)   238 ( 4.0)   232 ( 4.7)   223 ( 5.0)
  Goods production        50          35            5 ( 0.7)   228 ( 9.3)   220 ( 9.4)   214 (11.0)
  Janitorial              155         106          14 ( 1.1)   236 ( 5.4)   230 ( 4.7)   222 ( 5.1)
  Grounds                 88          61            8 ( 1.1)   236 ( 7.5)   238 ( 9.9)   240 ( 8.4)
  Food preparation        125         85           11 ( 1.2)   254 ( 4.9)   244 ( 6.0)   248 ( 6.9)
  Other services          77          48            6 ( 0.7)   257 ( 5.1)   252 ( 6.4)   253 ( 7.6)
  Maintenance             104         69            9 ( 1.0)   255 ( 6.2)   258 ( 5.5)   251 ( 5.3)
  Enrolled in school      59          38            5 ( 0.9)   240 ( 7.1)   238 (10.0)   228 ( 7.7)
  Other                   207         133          17 ( 1.7)   264 ( 4.8)   256 ( 5.5)   256 ( 5.3)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); CPCT = column percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the
true population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

significant link between literacy proficiency and the kinds of work assignments
prisoners are involved in. The exception is janitorial work: the proficiency of
inmates involved in janitorial work is lower on the quantitative scale than that
of inmates in other services, maintenance, and other unspecified jobs; lower on
the document scale than that of inmates in maintenance and other jobs; and lower
on the prose scale than that of inmates in other jobs. On the other hand, when
the proficiencies of those involved in particular work assignments are
compared with the proficiencies of those who do not work, some differences
are apparent. On the quantitative scale, the proficiencies of prisoners involved
in maintenance, other services, and other unspecified jobs are higher than
the proficiencies of those not working. In addition, on the document scale,
prisoners involved in maintenance and other jobs demonstrate higher
proficiency than those not working. The same holds true on the prose scale
for inmates working in other unspecified jobs.
Seventy-seven percent of those prisoners with postsecondary education
were working in prison as compared with 64 percent of those who had not
completed a GED or high school diploma (table 4.5). Although it appears that

TABLE 4.5
Percentages of Inmates Reporting Level of Education,
by Whether Working in Prison

                                                  WORKING IN PRISON
                                      WGT N       Working      Not working
EDUCATION LEVEL           n           (/1,000)    RPCT ( SE )  RPCT ( SE )

Total                     1,142       762          69 ( 2.5)    31 ( 2.5)
0 to 12 years             541         377          64 ( 3.0)    36 ( 3.0)
H.S. diploma/GED          337         237          72 ( 3.3)    28 ( 3.3)
Postsecondary             264         149          77 ( 3.1)    23 ( 3.1)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); RPCT = row percentage estimate; (SE) = standard error of the estimate (the true population value can be said to be
within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

a greater percentage of those with a high school diploma or GED have work
assignments than do those with no diploma, the difference between the two
does not reach statistical significance. Having a high school diploma or GED
did not increase the likelihood of working in prison.

Joining Groups While in Prison
In addition to working or taking education and vocational classes, prisoners in
most cases have the opportunity to join groups while in prison. As shown in
table 4.6, 53 percent of prisoners reported joining groups of various types. The
three most frequently joined groups are addiction (29 percent), religious
(26 percent), and life skills (20 percent) groups.
The proficiency scores of prisoners who joined groups are significantly
higher than those of nonjoiners on all three scales. In addition, prisoners who
are involved in three or more groups (17 percent) demonstrate significantly
higher average proficiencies than those who joined only one or two groups,
except on the quantitative scale where the most involved prisoners perform
about the same as those involved in two groups. When the proficiency scores of
those in the various groups are compared with the scores of those who did not
join groups, only the proficiencies of those in religious groups are not
significantly higher on all three scales. When proficiencies among the various
groups are compared, generally prisoners demonstrate about the same
proficiencies on the three scales, regardless of the type of group joined.

Summary
Although there are differences in demonstrated proficiencies among inmates
who are involved in certain prison activities and those who are not, when
proficiencies are compared by the specific kinds of activities, there are few
significant differences in demonstrated proficiency. This would suggest that
literacy proficiency does not have a strong relationship with participation in a
specific program or group. The exception to this seems to be involvement in
education and/or vocational programs.
The following are summary highlights of the relationship between literacy
proficiency and life in prison.
• When proficiencies of inmates are compared by type of offense, there are no
significant differences except that inmates sentenced for property offenses
demonstrate higher proficiencies than those sentenced for drug offenses on
the prose and document scales and than those sentenced for violent offenses
on the document scale.

TABLE 4.6
Percentages and Average Proficiencies on Each Literacy Scale
of Inmates Reporting Groups Joined in Prison

                                                  LITERACY SCALES
                                      WGT N                    Prose        Document     Quantitative
GROUPS JOINED             n           (/1,000)    CPCT ( SE )  PROF ( SE )  PROF ( SE )  PROF ( SE )

Total Population
  Total                   1,145       764         100 ( 0.0)   246 ( 1.9)   241 ( 2.2)   236 ( 3.0)

Have You Joined Any Group?
  Yes                     619         406          53 ( 2.1)   252 ( 2.9)   247 ( 3.1)   243 ( 3.8)
  No                      526         358          47 ( 2.1)   240 ( 2.7)   233 ( 3.1)   228 ( 4.0)

Types of Groups Joined
  None                    526         358          47 ( 2.1)   240 ( 2.7)   233 ( 3.1)   228 ( 4.0)
  Addiction               343         223          29 ( 2.2)   255 ( 4.1)   252 ( 4.0)   247 ( 5.1)
  Religious               310         201          26 ( 1.6)   250 ( 4.6)   241 ( 4.3)   237 ( 5.0)
  Life skills             235         151          20 ( 1.8)   262 ( 4.2)   258 ( 4.6)   251 ( 5.1)
  Racial/Ethnic           52          34            4 ( 0.8)   271 ( 6.8)   256 ( 7.6)   262 ( 7.9)
  Prisoner assistance     74          47            6 ( 1.0)   268 ( 6.9)   263 ( 5.0)   260 ( 5.6)
  Outside community
    activities            25          15            2 ( 0.5)   *** ( ****)  *** ( ****)  *** ( ****)
  Prerelease              37          25            3 ( 0.6)   *** ( ****)  *** ( ****)  *** ( ****)
  Other                   52          36            5 ( 0.8)   269 ( 6.8)   265 ( 6.9)   256 ( 7.7)

Number of Groups Joined
  0                       526         358          47 ( 2.1)   240 ( 2.7)   233 ( 3.1)   228 ( 4.0)
  One                     239         159          21 ( 1.6)   244 ( 4.0)   238 ( 4.3)   235 ( 4.8)
  Two                     178         119          16 ( 1.1)   246 ( 5.0)   242 ( 5.6)   239 ( 6.7)
  Three or more           202         128          17 ( 1.7)   267 ( 4.3)   263 ( 4.4)   255 ( 5.0)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); CPCT = column percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the
true population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
*** Sample size is insufficient to permit a reliable estimate (fewer than 45 respondents).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

• Those sentenced for five years or less demonstrate higher proficiency on the
prose scale than those sentenced for more than 10 years as well as on the
document scale than those who do not expect to be released.
• The proficiency scores of those only in vocational training programs are
higher than those of inmates who do not participate in any training program
and of those who participate in education only or in both types of programs.
Inmates in vocational-only programs, however, still perform, on average,
only in Level 2.
• Almost 70 percent of the inmates reported working, and these inmates
demonstrate higher proficiencies than those not working.
• Over one-half of the inmates reported being involved in at least one group in
prison, and they demonstrate higher proficiencies than those who are not
involved in groups.
• The scores of those who joined three or more groups generally are higher
than the scores of those who joined fewer groups.


CHAPTER 5
Recidivism and Literacy

When surveyed, prisoners were asked if they had ever been placed on
probation or served time in a jail, prison, juvenile, or other correctional facility
for another offense before their current term of confinement. Based upon
their responses, prisoners with a former sentence to either probation or
incarceration were classified as recidivists — or repeat offenders.
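The classification rule (a recidivist is anyone reporting either prior probation or prior confinement) can be stated as a small predicate. This is an illustrative encoding, not part of the survey instrument:

```python
def is_recidivist(prior_probation, prior_confinement):
    """A prior sentence of either kind classifies the respondent as a recidivist."""
    return prior_probation or prior_confinement

print(is_recidivist(True, False))   # True
print(is_recidivist(False, False))  # False
```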
It is often asserted that prisoners with higher literacy levels are less likely
than prisoners with lower skill levels to be repeat offenders. Prisoners who can
read with comprehension, fill out forms, and analyze numbers are more likely to
develop high self-esteem, find employment, and be able to avoid criminal
behavior when released than those without those skills. Recent studies, such as
ones conducted by the state of Alabama, have generally concluded that inmates
who participate in educational and vocational programs while in prison are less
likely to return than those prisoners who do not attend school or training.1
These studies have tracked a cohort of prisoners with and without additional
education or training to see if they incur additional sentences after their
release. Since the National Adult Literacy Survey provides information on
prisoners at only one point in time, this type of before-and-after study of a test
and control group could not be done. An experimental design of this sort is
generally the preferred method of determining the effect of education or
vocational training on recidivism. This survey, therefore, cannot determine the
effect of education and training programs on prisoners after they are released
from a prison as an experimental design would.
Nor can this study definitively examine the effect of education on criminal
history or vice versa. In 1991, almost one-half of state prison inmates reported
they had received academic education and about one-third reported receiving

1 M. O'Neil. (1990). “Correctional Higher Education: Reduced Recidivism?” Journal of Correctional Education, 41, 28-31. State of Alabama Department of Post-Secondary Education. (1992). “A Study of Alabama Prison Recidivism Rates of Inmates Having Completed Vocational and Academic Programs While Incarcerated Between the Years of 1987 through 1991.”

vocational training since entering prison for their current offense.2 For many
inmates with criminal records, a portion of their education was obtained
during incarceration. This fact complicates analysis of the interaction of literacy
and recidivism. In this study the prose, document, and quantitative literacy
skills of the prison population, as well as their level of education, were assessed
at the time of interview, even though these factors may have improved during
current and previous incarcerations, while respondents’ criminal histories were
summed up over a period of time. Literacy skills at the time of interview were
evaluated for prisoners who reported previously being on probation or in a
correctional facility as well as for those in prison for their first conviction. To
determine the effect of literacy on recidivism, increases in literacy skills should
be measured along with a historical record of interactions with the criminal
justice system. This study, however, does not track literacy skills over a period
of time concurrently with the building of a criminal history to determine effects
of increased literacy skills on recidivism.
In this chapter, then, the current literacy proficiencies of prisoners and
their prior records as reported to interviewers are examined together. In addition,
recidivism and literacy by educational attainment, race/ethnicity, and presence
of disabilities are discussed.

Prior Sentences of Prison Inmates
A majority of prison inmates reported having previous sentences to probation
or confinement in a jail, prison, juvenile, or other correctional facility (table
5.1). An estimated 77 percent had been either on probation or in a correctional
facility in the past. About 6 in 10 prison inmates had served probation. Almost
two-thirds had been sentenced to spend time in a correctional facility.
Prisoners who had been on probation or in a jail, prison, or juvenile facility
demonstrate about the same literacy skills as those who had no previous
criminal justice status (table 5.1). On all three scales, the average proficiency
scores of prisoners who had at some time in their lives served on probation or
in a correctional facility are about the same as those who had never been on
probation or incarcerated. Moreover, the proficiency scores of inmates who
started their criminal careers as juveniles are no different from the scores of
those who began their interactions with the criminal justice system as adults.
Although the proficiency scores on the prose scale range from 242 for those
with no prior sentence to 250 for those who served sentences as both juveniles
and adults, with juvenile-only and adult-only scores in between, the
differences in scores are not statistically significant. There is a similar
lack of relationship among the scores on the other scales.

2 A. Beck, et al. (March 1993). Survey of State Prison Inmates, 1991. Washington, DC: U.S. Department of
Justice.

TABLE 5.1
Percentages and Average Proficiencies on Each Literacy Scale
of Inmates, by Recidivism

                                                         LITERACY SCALES
                                     WGT N        Prose                  Document               Quantitative
RECIDIVISM                    n     (/1,000)  CPCT (SE)  PROF (SE)   CPCT (SE)  PROF (SE)   CPCT (SE)  PROF (SE)

Total Population
  Total                    1,143       763    100 (0.0)  246 ( 1.9)  100 (0.0)  240 ( 2.2)  100 (0.0)  236 ( 3.0)

Probation
  No probation               448       292     39 (2.0)  242 ( 3.0)   39 (2.0)  236 ( 3.5)   39 (2.0)  239 ( 4.0)
  Juvenile only              153       105     14 (1.1)  245 ( 4.7)   14 (1.1)  244 ( 5.0)   14 (1.1)  235 ( 6.6)
  Adult only                 307       203     27 (1.7)  247 ( 3.5)   27 (1.7)  240 ( 3.6)   27 (1.7)  231 ( 4.6)
  Both juvenile and adult    220       151     20 (1.5)  254 ( 4.8)   20 (1.5)  247 ( 4.5)   20 (1.5)  239 ( 5.3)

Incarceration
  No incarceration           426       272     36 (1.7)  244 ( 3.5)   36 (1.7)  239 ( 4.0)   36 (1.7)  237 ( 4.4)
  Juvenile only               57        38      5 (0.5)  244 ( 8.8)    5 (0.5)  228 ( 9.7)    5 (0.5)  226 (10.3)
  Adult only                 405       275     37 (1.4)  245 ( 3.4)   37 (1.4)  241 ( 3.4)   37 (1.4)  236 ( 4.4)
  Both juvenile and adult    236       164     22 (1.4)  252 ( 4.1)   22 (1.4)  244 ( 4.7)   22 (1.4)  237 ( 4.7)

Probation and/or Incarceration
  None for both              267       168     22 (1.5)  242 ( 4.6)   22 (1.5)  236 ( 5.0)   22 (1.5)  238 ( 5.3)
  Juvenile only               83        55      7 (0.8)  247 ( 6.7)    7 (0.8)  237 ( 7.2)    7 (0.8)  236 ( 7.6)
  Adult only                 405       272     36 (1.6)  245 ( 3.1)   36 (1.6)  241 ( 3.2)   36 (1.6)  235 ( 4.2)
  Both juvenile and adult    371       257     34 (1.5)  250 ( 3.4)   34 (1.5)  244 ( 3.6)   34 (1.5)  236 ( 4.5)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); CPCT = column percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the
true population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.
The proficiency scores of previously incarcerated inmates are about the
same as the scores of those who had never served a sentence in a correctional
facility. The greatest mean differences on each scale — 8 points on the prose
scale, 11 on the document scale, and 11 on the quantitative scale — do not
reach statistical significance given the relatively small sample size.
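The significance judgments here follow from the standard errors reported in table 5.1: a difference is treated as significant roughly when it exceeds two standard errors of the difference. As a minimal sketch (not part of the report; it assumes the two estimates are independent), the following Python snippet checks the largest prose-scale gap, 244 versus 252, using the reported standard errors of 3.5 and 4.1, and also forms the 95 percent confidence interval described in the table notes:

```python
import math

def z_statistic(mean_a, se_a, mean_b, se_b):
    """Two-sample z statistic for independent estimates with known SEs."""
    return (mean_b - mean_a) / math.sqrt(se_a**2 + se_b**2)

# Prose means and SEs from table 5.1 (incarceration panel): never incarcerated
# 244 (3.5) vs. incarcerated as both juvenile and adult 252 (4.1).
z = z_statistic(244, 3.5, 252, 4.1)
print(round(z, 2))       # ~1.48, below the 1.96 cutoff for p < .05
print(abs(z) < 1.96)     # the 8-point gap is not statistically significant

# 95 percent confidence interval implied by the table notes: estimate +/- 2 SE,
# here for the total prose proficiency of 246 (SE 1.9).
lo, hi = 246 - 2 * 1.9, 246 + 2 * 1.9
print(lo, hi)            # roughly 242.2 to 249.8
```

With larger samples the same 8-point gap would have smaller standard errors and could become significant, which is why the text attributes the result partly to sample size.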
Inmates who had prior probation sentences demonstrate about the same
literacy proficiency as inmates serving time for their first offense. Here, too,
the scores on each of the three scales are not statistically different.
In conclusion, recidivists on average demonstrate about the same literacy
levels as inmates with no prior sentences. The proficiency scores of those who
started their criminal careers as juveniles, either on probation or in a facility,
are about the same as the scores of those who had never been on probation or
in a correctional facility and of those repeat offenders who started their
criminal careers as adults.

Number of Prior Sentences to
Probation and/or Incarceration and Literacy
Many prison inmates had extensive interactions with the criminal justice
system. Almost half had been on some combination of probation and
incarceration three or more times (table 5.2). Almost 3 in 10 had been
sentenced to a correctional facility three or more times. About 14 percent
had been on probation three or more times.
Prisoners with no prior criminal justice status had about the same literacy
proficiencies as those with extensive criminal records, that is, those who had
been on probation and/or incarcerated three or more times. First timers
averaged 242 on the prose scale; those with three or more convictions averaged
250 — scores which are not statistically different. In addition, the proficiency
scores of prisoners previously on probation and/or incarcerated one or two
times are not statistically different from those of prisoners who had never been
in a correctional facility before (table 5.2). The scores average in the lower half
of Level 2. The same pattern is apparent for the document and quantitative scales.
For the prose and document scales, the average proficiency scores of prisoners
who had previously been on probation at least three times are 17 points higher
than those of prisoners who had never been on probation. This is not because
prisoners with extensive probation records had attended more school: inmates
on probation three or more times had attended school for about the same amount
of time as those who had never been on probation. Prisoners with no probation
record attended school an average of 11.1 years; those with three or more
probation sentences, 10.8 years (see appendix B, table 1). In addition,
prisoners with three or more probations were less likely to have been employed
than prisoners without probation sentences. Indeed, the longer the probation
record, the less likely it was that prisoners had been employed prior to their
current incarceration: 59.5 percent of prisoners with three or more probations
were employed either full time or part time before their current
incarceration, compared with 65.1 percent of those with two previous
probations, 67.4 percent of those with one prior probation, and 72.4 percent
of those with no previous probation sentence (see appendix B, table 2).

TABLE 5.2
Percentages and Average Proficiencies on Each Literacy Scale
of Inmates, by Number of Times Recidivated

                                                         LITERACY SCALES
NUMBER OF TIMES                      WGT N        Prose                  Document               Quantitative
RECIDIVATED                   n     (/1,000)  CPCT (SE)  PROF (SE)   CPCT (SE)  PROF (SE)   CPCT (SE)  PROF (SE)

Total Population
  Total                    1,143       763    100 (0.0)  246 ( 1.9)  100 (0.0)  240 ( 2.2)  100 (0.0)  236 ( 3.0)

Probation (juvenile and/or adult)
  None                       448       292     39 (2.0)  242 ( 3.0)   39 (2.0)  236 ( 3.5)   39 (2.0)  239 ( 4.0)
  1 time                     327       223     30 (1.6)  243 ( 3.5)   30 (1.6)  239 ( 3.9)   30 (1.6)  230 ( 5.4)
  2 times                    192       129     17 (1.2)  250 ( 4.1)   17 (1.2)  242 ( 4.8)   17 (1.2)  235 ( 5.5)
  3 or more times            161       108     14 (1.3)  259 ( 5.1)   14 (1.3)  253 ( 5.3)   14 (1.3)  242 ( 5.5)

Incarceration (juvenile and/or adult)
  None                       426       272     36 (1.7)  244 ( 3.5)   36 (1.7)  239 ( 4.0)   36 (1.7)  237 ( 4.4)
  1 time                     231       155     21 (1.1)  245 ( 5.1)   21 (1.1)  237 ( 5.4)   21 (1.1)  233 ( 6.0)
  2 times                    148       103     14 (1.0)  251 ( 4.9)   14 (1.0)  249 ( 4.9)   14 (1.0)  243 ( 5.2)
  3 or more times            318       219     29 (1.7)  248 ( 3.8)   29 (1.7)  241 ( 4.0)   29 (1.7)  233 ( 4.5)

Probation and/or Incarceration (juvenile and/or adult)
  None                       267       168     22 (1.5)  242 ( 4.6)   22 (1.5)  236 ( 5.0)   22 (1.5)  238 ( 5.3)
  1 time                     182       123     16 (1.1)  246 ( 4.4)   16 (1.1)  240 ( 4.9)   16 (1.1)  239 ( 5.3)
  2 times                    173       116     15 (1.1)  241 ( 6.0)   15 (1.1)  235 ( 6.8)   15 (1.1)  229 ( 7.9)
  3 or more times            504       344     46 (1.9)  250 ( 2.9)   46 (1.9)  245 ( 3.0)   46 (1.9)  237 ( 4.0)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); CPCT = column percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the
true population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.
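The 17-point probation contrast can be checked with the same two-standard-error criterion given in the table notes. This is a minimal sketch, not part of the report, using the prose-scale estimates for the probation panel of table 5.2 and assuming the two estimates are independent:

```python
import math

# Prose-scale estimates from table 5.2 (probation panel):
# never on probation: 242 (SE 3.0); probation three or more times: 259 (SE 5.1).
diff = 259 - 242
se_diff = math.sqrt(3.0**2 + 5.1**2)   # SE of the difference under independence
z = diff / se_diff
print(round(z, 2))   # ~2.87
print(z > 1.96)      # True: the 17-point gap exceeds the p < .05 threshold
```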

Recidivism and Educational Level of Prison Inmates
An estimated 76 percent of inmates with a high school diploma, GED, or
postsecondary education had been on probation or in a correctional facility
prior to being incarcerated for their current offense, compared with 81
percent of inmates without a high school diploma or GED (table 5.3); the
difference does not reach statistical significance.
The proficiency scores of prisoners with a high school diploma, GED, or
postsecondary education average about the same on the three literacy scales
regardless of whether they were serving their first sentence or had previously
been on probation or in a correctional facility. On all three scales, average
proficiency scores generally are within 10 points of 270 (table 5.3). None of
the differences reaches statistical significance. The proficiency scores of
prisoners with less than a high school diploma also average about the same on
all three scales regardless of recidivism, with scores on average in the low 200s.
Prisoners who had never finished high school or earned a GED score
significantly lower than those who had at least completed high school,
regardless of criminal history. While those with less than a high school
education perform, on average, in Level 1, those who completed at least high
school perform, on average, in Level 2.

TABLE 5.3
Percentages and Average Proficiencies on Each Literacy Scale
of Inmates at Each Education Level Reporting Recidivism

                                                              RECIDIVISM
LITERACY SCALES BY                   WGT N                        Probation            Incarceration          Both probation
LEVEL OF EDUCATION            n     (/1,000)       None              only                  only              and incarceration
                                              RPCT (SE) PROF (SE)  RPCT (SE) PROF (SE)  RPCT (SE) PROF (SE)  RPCT (SE) PROF (SE)

Prose
  0 to 12 years              538       375    19 (1.8)  211 (6.6)  13 (1.7)  213 (7.3)  17 (1.9)  213 (7.5)  51 (2.4)  229 (3.6)
  H.S. diploma/GED or more   600       385    24 (2.0)  268 (5.4)  14 (1.7)  278 (4.4)  16 (1.6)  273 (5.1)  45 (2.3)  271 (3.8)
Document
  0 to 12 years              538       375    19 (1.8)  201 (8.3)  13 (1.7)  211 (7.8)  17 (1.9)  208 (8.3)  51 (2.4)  223 (4.2)
  H.S. diploma/GED or more   600       385    24 (2.0)  265 (5.1)  14 (1.7)  271 (5.5)  16 (1.6)  266 (4.4)  45 (2.3)  265 (4.5)
Quantitative
  0 to 12 years              538       375    19 (1.8)  204 (8.0)  13 (1.7)  199 (7.8)  17 (1.9)  214 (7.7)  51 (2.4)  211 (5.7)
  H.S. diploma/GED or more   600       385    24 (2.0)  266 (5.1)  14 (1.7)  269 (7.6)  16 (1.6)  268 (5.8)  45 (2.3)  258 (4.8)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); RPCT = row percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the true
population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

Recidivism and Race/Ethnicity of Prison Inmates
An estimated 83 percent of Black inmates had previous sentences, compared
with 78 percent of White inmates and 68 percent of those in the other racial/
ethnic group, which comprises primarily Hispanics (84 percent of the other
category), along with Native Americans and Asians (table 5.4). In addition,
70 percent of Black inmates had been in a correctional facility for a previous
offense, compared with 63 percent of White inmates and 54 percent of other
inmates. Black and White inmates, however, were equally likely to be
recidivists, and both Black and White inmates were more likely to be recidivists
than other inmates.
Black prisoners demonstrate about the same proficiencies on the literacy
scales regardless of previous criminal record (table 5.4). All Black prisoners
performed, on average, in the lower half of Level 2. White inmates generally
demonstrate a similar pattern, performing, on average, in the upper half of
Level 2, regardless of prior probations or incarcerations.


The proficiency scores of Black inmates, however, are, on average,
significantly lower than those of White inmates regardless of recidivism.
Particularly on the quantitative scale, the proficiency scores range from 40 to
almost 50 points lower than those of their White counterparts, while on the
prose scale they are from 25 to almost 35 points lower, and on the document
scale they are about 35 to 45 points lower.

Recidivism and Disabilities of Prison Inmates
As shown in chapter 2, prisoners are more likely than the general population to
have a disability — 36 percent of prisoners compared with 26 percent of the
household population have one or more disabilities. Disabilities include visual,
hearing, learning, mental or emotional, physical, or long-term disabilities.
Many of these conditions can lessen a person’s ability to read and compute,
making a person less employable, less able to deal with the demands and
stresses of living in the 20th century, and more likely to end up with a criminal
career.3 According to the data in table 5.5, there appears to be no relationship
between inmates’ criminal careers and the presence or absence of a disability.
Almost 8 in 10 with or without disabilities had been on probation or in a
correctional facility before their current sentence. About two-thirds had
previously been in a jail, prison or juvenile facility — that is, incarcerated only
or both on probation and incarcerated — regardless of the presence or absence
of disabilities. However, due to the small sample size of this study, all
disabilities have been grouped together. It may be that examining selected
disabilities, particularly a learning disability, would yield different results.
Moreover, disabilities in this report are self-reported by the respondent. Some
respondents who would be classified as disabled on the basis of formal
examinations may be unaware of their status.
Inmates who had never been on probation or in a correctional facility
demonstrate about the same prose, document, and quantitative proficiencies,
whether or not they had a disability (table 5.5). Their average proficiency
scores range from 230 to 245 on all three scales. Literacy proficiency does
vary among recidivists, however, depending on the presence or absence of
disabilities. In general, the proficiency scores of recidivists with
disabilities are lower than the scores of recidivists with no disabilities.
While the proficiency scores of those with disabilities who had been on
probation only or incarcerated only should be interpreted with caution, the
proficiency scores for those who had been both on probation and incarcerated
confirm the pattern — the scores of those with disabilities are about 20 to 30
points lower on the three scales than the scores of those with no disabilities.

3 E. Herrick. (September-October 1988). "The Hidden Handicap in Prison." Corrections Compendium, 13 (1).

TABLE 5.5
Percentages and Average Proficiencies on Each Literacy Scale
of Inmates With or Without Disabilities Reporting Recidivism

                                                              RECIDIVISM
LITERACY SCALES BY                   WGT N                        Probation             Incarceration          Both probation
DISABILITIES                  n     (/1,000)       None              only                   only              and incarceration
                                              RPCT (SE) PROF (SE)   RPCT (SE) PROF (SE)   RPCT (SE) PROF (SE)   RPCT (SE) PROF (SE)

Prose
  No disability              733       483    22 (2.0)  244 ( 5.9)   15 (1.5)  255 ( 5.9)   16 (1.6)  255 (4.5)   48 (2.3)  256 (3.5)
  One or more disabilities   399       273    23 (1.9)  240 ( 5.9)!  11 (1.8)  230 (10.9)!  18 (2.3)  223 (9.2)!  49 (3.0)  235 (6.2)
Document
  No disability              733       483    22 (2.0)  238 ( 6.7)   15 (1.5)  249 ( 6.1)   16 (1.6)  251 (4.0)   48 (2.3)  250 (4.0)
  One or more disabilities   399       273    23 (1.9)  232 ( 7.3)!  11 (1.8)  225 (12.8)!  18 (2.3)  215 (9.7)!  49 (3.0)  231 (7.2)
Quantitative
  No disability              733       483    22 (2.0)  241 ( 7.5)   15 (1.5)  241 ( 7.6)   16 (1.6)  253 (5.6)   48 (2.3)  243 (5.1)
  One or more disabilities   399       273    23 (1.9)  232 ( 8.4)!  11 (1.8)  216 (11.9)!  18 (2.3)  223 (7.9)!  49 (3.0)  215 (8.0)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); RPCT = row percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the true
population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
! Interpret with caution — the nature of the sample does not allow accurate determination of the variability of this statistic.
Source: Educational Testing Service, National Adult Literacy Survey, 1992.


Summary
About 77 percent of prison inmates are repeat offenders. White and Black
inmates are more likely to be repeat offenders than inmates of other racial/
ethnic groups. But inmates with disabilities are about as likely as those without
disabilities to be repeat offenders.
Because this study assesses literacy only at a single point in time, it has
not been possible to trace the effects of increasing education or literacy
proficiency on recidivism. Nor does the study provide literacy proficiencies
for adults who increased their skills while in prison but who are no longer
incarcerated.
What is clear is that those inmates who came back to the criminal justice
system generally did not differ from first timers with respect to literacy skills.
It is impossible to say whether they achieved their current skill levels before
entering prison for their current offense or during their present incarceration.
There are also no differences in proficiency scores with respect to
recidivism when comparisons are made within levels of education and within
racial/ethnic groups. In addition, across categories of recidivism, the
proficiencies of inmates with no disabilities as well as the proficiencies of
inmates with disabilities are about the same.


CHAPTER 6
Comparing Literacy Practices and Self-Perceptions
of the Prison and Household Populations

Introduction
This report highlights the many ways in which the prison population differs
from the household population. By extending comparisons already drawn
between the household and prison populations, this chapter compares the
literacy practices of the prison and household populations. The literacy
practices discussed include the following:
• reading a variety of materials encountered in daily life
• writing or filling out different materials
• using arithmetic
• reading different types of books
The literacy practices of the prison and the household populations are
compared by the following:
• the frequency of reading and writing various materials and using arithmetic
• the differences in proficiency scores among those who report frequent,
  occasional, and infrequent reading, writing, and use of arithmetic

The frequency with which people use different kinds of printed and written
information may reflect the demands of the particular contexts in which they
function. Furthermore, without adequate opportunity to interact with texts
or documents, individuals may not develop the requisite skills needed to read
such materials.
As a complement to investigating the frequency of certain literacy practices,
this chapter also compares the prison and household populations with respect to
self-perception of their ability to perform certain literacy activities. Related to
that is how frequently adults receive help on various literacy-related tasks.


Reading Practices
Inmates were asked how often they read or used particular materials: letters or
memos; reports, articles, magazines, or journals; manuals or reference books,
including catalogs or parts lists; directions or instructions; diagrams or
schematics; and bills, invoices, spreadsheets, or budget tables. The household
population was asked two versions of the question: once in connection with
respondents’ personal use and once in connection with their job. Thus, the
tables for the household population present data that are an aggregate of
personal and job use.

Table 6.1
Percentages of Prison and Household Populations
Reporting Frequency of Reading Materials in English

                                                      FREQUENCY
                                              Every day or
MATERIALS BY                         WGT N    a few times                     Less than
POPULATIONS                   n     (/1,000)  a week          Once a week     once a week
                                              RPCT (SE)       RPCT (SE)       RPCT (SE)

Letters, Memos
  Prison                   1,145        764    61 (1.5)        14 (1.0)        25 (1.4)
  Household               24,914    190,316    46 (0.5)        14 (0.3)        40 (0.6)
Reports, Articles
  Prison                   1,145        764    49 (1.5)        17 (1.3)        34 (1.3)
  Household               24,906    190,278    44 (0.6)        19 (0.3)        37 (0.5)
Manuals, Reference
  Prison                   1,142        762    29 (1.4)        15 (1.5)        56 (1.9)
  Household               24,889    190,104    29 (0.5)        19 (0.4)        51 (0.5)
Directions, Instructions
  Prison                   1,142        762    22 (1.4)        10 (1.0)        69 (1.5)
  Household               24,875    190,004    40 (0.5)        19 (0.4)        41 (0.4)
Diagrams
  Prison                   1,142        762    14 (1.1)         7 (0.8)        79 (1.2)
  Household               24,841    189,651    10 (0.2)         7 (0.2)        83 (0.3)
Bills, Invoices
  Prison                   1,144        763    12 (1.0)         8 (1.0)        81 (1.5)
  Household               24,874    189,886    41 (0.5)        21 (0.3)        38 (0.4)

n = sample size; WGT N = population size estimate /1,000 (the sample sizes for subpopulations may not add up to the total
sample sizes because of missing data); RPCT = row percentage estimate; (SE) = standard error of the estimate (the
true population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey, 1992.

Table 6.1 shows the percentages of inmates and householders reporting
how often they read certain materials. For both populations, the two types of
most frequently read materials (at least a few times a week) are letters or
memos and reports or articles. Sixty-one percent of the inmates, however,
reported reading letters or memos frequently compared with 46 percent of the
household population. About 50 percent of the inmates reported reading
reports or articles frequently, compared with 44 percent of the householders,
and 14 and 10 percent of inmates and householders, respectively, reported
reading diagrams frequently. Twenty-nine percent of both populations
reported frequent reading of manuals and reference books. Almost twice as many
householders as inmates, however, reported reading directions or instructions
frequently, and almost four times as many reported reading bills, invoices,
and other such documents.
Table 6.2 shows the proficiency scores for the prison and household
populations reporting frequency of reading particular materials by selected
scales. The particular scale or scales for each type of material were selected
because the skills that characterize the scale(s) are the ones that are used to
read the type of material. (This also holds true for writing and arithmetic
practices, which are reported later in the chapter.)
As shown in table 6.2, the proficiency scores of inmates who read any of
the materials less than once a week are significantly lower than the scores of
those who read any of the materials frequently (at least a few times a week).
The scores of inmates who read less than once a week are also lower than the
scores of those who read once a week, with the exception of reading directions.
When inmates are compared with householders, the proficiencies of inmates
who read materials frequently are about the same as those of householders who
read the same materials less than once a week. The exception is on the prose
scale for reports, in which case the proficiency of inmates who read frequently
is higher than that of householders who read reports infrequently.
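These comparisons can be spot-checked against the two-standard-error rule stated in the table notes. The following Python sketch is not part of the report (the `z_stat` helper is an illustrative name) and assumes the estimates are independent; it contrasts the reports case, where the gap is significant, with the letters case, where it is not:

```python
import math

def z_stat(a, se_a, b, se_b):
    """z statistic for the difference of two independent estimates."""
    return (a - b) / math.sqrt(se_a**2 + se_b**2)

# Table 6.2, prose scale: inmates reading reports frequently, 260 (2.8),
# vs. householders reading reports infrequently, 245 (1.1).
print(round(z_stat(260, 2.8, 245, 1.1), 2))   # ~4.99, significant

# Table 6.2, prose scale: inmates reading letters frequently, 258 (2.4),
# vs. householders reading letters infrequently, 256 (1.1).
print(round(z_stat(258, 2.4, 256, 1.1), 2))   # ~0.76, not significant
```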

TABLE 6.2
Average Proficiencies on Literacy Scales of Prison and
Household Populations Reporting Frequency of Reading Materials in English

                                                      FREQUENCY
MATERIALS BY SCALES                  WGT N    Every day or a few                    Less than
BY POPULATIONS                n     (/1,000)  times a week       Once a week        once a week
                                              PROF (SE)          PROF (SE)          PROF (SE)

Letters, Memos
  Prose
    Prison                 1,145        764   258 ( 2.4)         240 ( 5.9)         220 ( 3.9)
    Household              24,914  190,316    287 ( 0.8)         274 ( 1.1)         256 ( 1.1)
Reports, Articles
  Prose
    Prison                 1,145        764   260 ( 2.8)         255 ( 4.1)         222 ( 3.3)
    Household              24,906  190,278    294 ( 0.7)         279 ( 1.2)         245 ( 1.1)
Manuals, Reference
  Prose
    Prison                 1,142        762   264 ( 3.2)         255 ( 4.2)         234 ( 2.5)
    Household              24,889  190,104    292 ( 0.9)         285 ( 1.2)         257 ( 0.9)
  Document
    Prison                 1,142        762   257 ( 3.3)         250 ( 4.0)         229 ( 2.9)
    Household              24,889  190,104    286 ( 1.0)         281 ( 1.3)         251 ( 0.9)
Directions, Instructions
  Prose
    Prison                 1,142        762   257 ( 3.9)         254 ( 6.1)         241 ( 2.5)
    Household              24,875  190,004    281 ( 0.9)         285 ( 1.3)         259 ( 1.0)
  Document
    Prison                 1,142        762   249 ( 4.0)         248 ( 6.0)         236 ( 2.7)
    Household              24,875  190,004    276 ( 0.9)         277 ( 1.3)         254 ( 1.0)
Diagrams
  Document
    Prison                 1,142        762   262 ( 4.3)         270 ( 4.9)         234 ( 2.2)
    Household              24,841  189,651    291 ( 1.6)         289 ( 1.9)         262 ( 0.8)
Bills, Invoices
  Document
    Prison                 1,144        763   256 ( 4.4)         256 ( 5.4)         237 ( 2.3)
    Household              24,874  189,886    284 ( 0.8)         276 ( 1.0)         244 ( 1.3)
  Quantitative
    Prison                 1,144        763   254 ( 5.7)         249 ( 5.9)         232 ( 3.3)
    Household              24,874  189,886    289 ( 0.9)         282 ( 1.2)         246 ( 1.4)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); (SE) = standard error of the estimate (the true population value can be said to be within 2 standard errors of the sample
estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

Writing Practices
Inmates were also asked how often they wrote or filled out various kinds of
documents: letters or memos, forms, and reports or articles. The household
population was asked how often they wrote or filled out these same materials
but, as with reading practices, separately for personal use and for job use.
Thus, once again the data for the household population are an aggregate of
their responses to these two aspects, personal and job.
As shown in table 6.3, about two-thirds of the inmates and about one-third
of the householders reported that they write letters frequently, that is, every
day or a few times a week. A slightly higher proportion of householders
(28 percent) than inmates (23 percent) reported filling out forms frequently.
With respect to report writing, a greater percentage of inmates (15 percent)
than householders (12 percent) reported engaging in this practice on a
frequent basis.

Table 6.3
Percentages of Prison and Household Populations
Reporting Frequency of Writing Materials in English

                                                      FREQUENCY
                                              Every day or
MATERIALS BY                         WGT N    a few times                     Less than
POPULATIONS                   n     (/1,000)  a week          Once a week     once a week
                                              RPCT (SE)       RPCT (SE)       RPCT (SE)

Letters, Memos
  Prison                   1,146        765    65 (1.4)        13 (1.1)        22 (1.2)
  Household               24,907    190,257    34 (0.5)        17 (0.3)        48 (0.5)
Filling Out Forms
  Prison                   1,139        759    23 (1.3)        16 (1.5)        62 (1.7)
  Household               24,901    190,214    28 (0.4)        25 (0.3)        33 (0.4)
Reports, Articles
  Prison                   1,139        757    15 (1.2)         9 (1.1)        76 (1.6)
  Household               24,878    190,062    12 (0.3)        10 (0.2)        79 (0.4)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total
sample sizes because of missing data); RPCT = row percentage estimate; (SE) = standard error of the estimate (the
true population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey, 1992.
Although a greater percentage of inmates than householders reported
writing letters or memos frequently, their proficiency scores on the prose scale
are lower than those of householders, 258 and 289 respectively (table 6.4). The
proficiency of inmates who write letters infrequently is 213, while that of
householders who write infrequently is 257, which is about the same as that of
inmates who write frequently. The pattern of scores is similar on the
quantitative scale for filling out forms and on the prose scale for writing
reports: the proficiency of householders who engage in these practices less than
once a week is comparable to that of inmates who engage in them frequently.

TABLE 6.4
Average Proficiencies on Literacy Scales of Prison and
Household Populations Reporting Frequency of Writing Materials in English

                                                      FREQUENCY
MATERIALS BY SCALES                  WGT N    Every day or a few                    Less than
BY POPULATIONS                n     (/1,000)  times a week       Once a week        once a week
                                              PROF (SE)          PROF (SE)          PROF (SE)

Letters, Memos
  Prose
    Prison                 1,146        765   258 ( 2.1)         242 ( 6.8)         213 ( 4.7)
    Household              24,907  190,257    289 ( 0.8)         284 ( 1.3)         257 ( 0.9)
Filling Out Forms
  Document
    Prison                 1,139        759   260 ( 3.0)         251 ( 3.7)         231 ( 3.0)
    Household              24,901  190,214    286 ( 1.0)         283 ( 1.0)         248 ( 1.1)
  Quantitative
    Prison                 1,139        759   252 ( 4.4)         244 ( 4.8)         229 ( 3.8)
    Household              24,901  190,214    291 ( 0.9)         289 ( 1.1)         251 ( 1.2)
Reports, Articles
  Prose
    Prison                 1,139        757   262 ( 4.3)         264 ( 4.7)         241 ( 2.4)
    Household              24,878  190,062    288 ( 1.5)         288 ( 1.7)         268 ( 0.7)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); (SE) = standard error of the estimate (the true population value can be said to be within 2 standard errors of the sample
estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

Arithmetic Practices
Another practice related to literacy proficiency is the frequency of using
arithmetic. Again, both populations were asked how frequently they used
arithmetic, with the household population responding for both personal and

work situations and their responses aggregated as shown in table 6.5. Sixty-six
percent of the inmates reported using arithmetic frequently (at least a few
times a week) compared with 78 percent of the household population. The
lower percentage of inmates may reflect that they have fewer opportunities or
less need to use arithmetic given their incarceration.
The proficiency scores of inmates who use arithmetic frequently and once
a week are significantly higher than the scores of those who use arithmetic
infrequently — 246 and 233 compared with 203.

76 . . . . . . Chapter 6

TABLE 6.5
Percentages and Average Proficiencies on the Quantitative Scale of Prison
and Household Populations Reporting Frequency of Using Arithmetic

                                      Every day or a few
                             WGT N    times a week          Once a week           Less than once a week
POPULATIONS          n      (/1,000)  RPCT (SE)  PROF (SE)  RPCT (SE)  PROF (SE)  RPCT (SE)  PROF (SE)

Prison              1,143       763   66 (1.6)   246 (3.4)  14 (1.0)   233 (6.6)  20 (1.5)   203 (5.5)
Household          24,883   190,066   78 (0.5)   285 (0.7)   9 (0.3)   252 (2.3)  13 (0.4)   205 (2.3)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); RPCT = row percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the true
population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

On the other hand, for the household population the proficiency of those
who use arithmetic frequently (285) is significantly higher than the
proficiency of those who do so once a
week (252), which in turn is significantly higher than the proficiency of those
who do so infrequently (205). The proficiency scores of householders who use
arithmetic frequently or once a week are significantly higher than the
proficiencies of inmates using arithmetic with the same frequency. There
seems to be no appreciable difference in the proficiencies of inmates and
householders who report infrequent use of arithmetic.
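The "significantly higher" and "no appreciable difference" comparisons above can be illustrated with the tables' own 2-standard-error criterion. The helper below is our own sketch of such a check, not the report's published procedure (which may differ, for example by adjusting for multiple comparisons):

```python
import math

# Sketch: treat two estimates as significantly different when they differ
# by more than 2 standard errors of the difference (assumed approximation
# of the report's 95 percent criterion; independent samples assumed).
def significantly_different(prof_a, se_a, prof_b, se_b):
    se_diff = math.sqrt(se_a ** 2 + se_b ** 2)
    return abs(prof_a - prof_b) > 2 * se_diff

# Inmates vs. householders who use arithmetic less than once a week
# (table 6.5): 203 (5.5) vs. 205 (2.3) -- no appreciable difference.
print(significantly_different(203, 5.5, 205, 2.3))  # False
# Frequent users: 246 (3.4) vs. 285 (0.7) -- householders score higher.
print(significantly_different(246, 3.4, 285, 0.7))  # True
```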

Reading Books
The frequency of literacy practices is one important indicator of how literacy
skills are put to use. Another dimension is the extent to which individuals
report reading book-length materials. Both the prison and household
populations were asked which types of books they had read within the last
six months (table 6.6).
Only 11 percent of the inmates reported not reading any book within the
last six months, compared with 17 percent of the householders. While these
nonreaders from both populations, on average, perform in Level 1, the
proficiency scores of householders are significantly higher than those of
inmates on both scales — 24 points higher on the prose scale and 40 points on
Chapter 6 . . . . . . 77

TABLE 6.6
Percentages and Average Prose and Document Proficiencies
of Prison and Household Populations, by Types of Books Read

                                                    LITERACY SCALES
TYPES OF BOOKS READ                 WGT N     Prose                  Document
BY POPULATIONS              n      (/1,000)   PCT (SE)   PROF (SE)   PCT (SE)   PROF (SE)

Fiction
  Prison                  1,142        766    61 (1.5)   263 (2.4)   61 (1.5)   258 (2.5)
  Household              24,901    190,524    49 (0.4)   298 (0.7)   49 (0.4)   290 (0.7)
Recreation/Entertainment
  Prison                  1,142        766    39 (1.5)   263 (2.4)   39 (1.5)   255 (2.9)
  Household              24,901    190,524    31 (0.5)   292 (1.0)   31 (0.5)   286 (1.1)
Current Affairs/History
  Prison                  1,142        766    44 (1.6)   262 (2.7)   44 (1.6)   256 (3.1)
  Household              24,901    190,524    32 (0.5)   293 (0.9)   32 (0.5)   286 (0.9)
Inspiration/Religion
  Prison                  1,142        766    53 (2.1)   251 (2.9)   53 (2.1)   245 (3.0)
  Household              24,901    190,524    36 (0.5)   279 (1.1)   36 (0.5)   270 (1.2)
Science/Social Science
  Prison                  1,142        766    30 (1.6)   265 (4.0)   30 (1.6)   256 (4.2)
  Household              24,901    190,524    23 (0.4)   301 (1.0)   23 (0.4)   295 (1.2)
Reference
  Prison                  1,142        766    58 (2.0)   260 (2.6)   58 (2.0)   255 (2.6)
  Household              24,901    190,524    54 (0.6)   294 (0.7)   54 (0.6)   288 (0.7)
Manuals
  Prison                  1,142        766    33 (1.6)   262 (2.5)   33 (1.6)   259 (2.8)
  Household              24,901    190,524    55 (0.4)   291 (0.6)   55 (0.4)   285 (0.8)
Other
  Prison                  1,142        766    17 (1.5)   270 (4.8)   17 (1.5)   263 (5.4)
  Household              24,901    190,524    19 (0.5)   301 (1.0)   19 (0.5)   295 (1.1)
None
  Prison                  1,142        766    11 (0.9)   189 (6.5)   11 (0.9)   169 (8.9)
  Household              24,901    190,524    17 (0.4)   213 (1.9)   17 (0.4)   209 (2.2)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); PCT = percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the true
population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

78 . . . . . . Chapter 6

the document scale. Inmates who reported not having read any books
demonstrate such low skills that they would have trouble with anything other
than very simple texts or very explicit locate tasks.
The types of books most frequently read by inmates are fiction, reference,
and inspiration and religion, whereas householders reported reading manuals,
reference, and fiction the most. This difference in reading habits may reflect
that householders are more likely to be in situations where they are
required to read these types of materials. As expected, given the data for other
literacy practices, the proficiencies of inmates who reported reading the various
types of books are lower than those of householders.
Another way to compare the prison population with the household population is
to look at the relationship between literacy level and book reading practice. Table
6.7 shows the percentages of prisoners and householders at each proficiency
level who reported reading and not reading books within the last six months. At
each proficiency level, a greater percentage of inmates than householders
reported reading books. The difference is particularly noticeable at Levels 1
and 2. For example, on the prose scale 76 percent of inmates at Level 1
reported reading books compared with 58 percent of householders, and
93 percent of inmates at Level 2 reported reading books compared with
83 percent of householders. Because such high percentages of prisoners at all
levels reported reading books, one pattern that is apparent for the household
population is not apparent for the prison population: that is, a step-by-step
increase in the percentage who reported reading books as the literacy level
increases. On both scales, the percentage of householders at Levels 4 and 5
who reported reading books is greater than the percentage at Level 3; the
percentage at Level 3 is greater than that at Level 2, which, in turn, is greater
than the percentage at Level 1. For the prison population, however, on the
prose scale only the percentage at Level 2 is greater than the percentage at
Level 1; after that the percentages level off. On the document scale, there are
no differences starting with Level 3 and above.

Self-Perceptions of Ability to Perform Literacy Activities
Both the prison and the household populations were asked how well they read
and write English and do arithmetic problems when they have to get the
numbers from materials written in English. A lower percentage of inmates
than householders reported doing the activities very well, while a higher
percentage reported they
do not perform the activities well. As shown in table 6.8, over one-half the
prisoners said they read and write very well in English and 40 percent said they
do arithmetic very well. In comparison, about 70 percent of householders said

Chapter 6 . . . . . . 79

TABLE 6.7
Percentages within Levels and Average Proficiencies on Prose and Document
Scales of Prison and Household Populations Reporting Reading Books

                                           LITERACY LEVELS AND AVERAGE PROFICIENCY
LITERACY SCALES BY
READING BOOKS BY               WGT N     Level 1      Level 2      Level 3      Levels 4 & 5   Average proficiency
POPULATIONS           n       (/1,000)   CPCT ( SE )  CPCT ( SE )  CPCT ( SE )  CPCT ( SE )    PROF ( SE )

Prose
  Prison
    Read books      1,023         679    76 ( 2.0)    93 ( 1.9)    96 ( 1.9)    100 ( 1.0)     253 ( 2.1)
    Read no books     119          83    24 ( 6.1)     7 ( 5.0)     4 ( 3.0)      1 ( 0.8)     189 ( 6.5)
  Household
    Read books     21,095     158,605    58 ( 0.8)    83 ( 1.0)    92 ( 0.7)     97 ( 0.5)     285 ( 0.6)
    Read no books   3,806      31,576    42 ( 1.4)    17 ( 1.2)     8 ( 1.0)      3 ( 0.5)     213 ( 1.9)
Document
  Prison
    Read books      1,023         679    78 ( 1.9)    92 ( 1.8)    97 ( 1.5)    100 ( 0.9)     249 ( 2.3)
    Read no books     119          83    22 ( 5.5)     8 ( 4.5)     3 ( 2.7)      0†( 0.6)     169 ( 8.9)
  Household
    Read books     21,095     158,605    62 ( 0.8)    84 ( 0.5)    92 ( 0.5)     96 ( 0.5)     279 ( 0.7)
    Read no books   3,806      31,576    38 ( 1.4)    16 ( 1.2)     8 ( 1.0)      4 ( 0.4)     209 ( 2.2)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); CPCT = column percentage estimate; PROF = average proficiency estimate; (SE) = standard error of the estimate (the
true population value can be said to be within 2 standard errors of the sample estimate with 95% certainty).
† Percentages less than 0.5 are rounded to 0.
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

80 . . . . . . Chapter 6

they read English very well, 64 percent said they write very well, and 53
percent said they do arithmetic very well. Ten, 13, and 17 percent of prisoners
said they do not read, write, or do arithmetic well, respectively. On the other
hand, between 5 and 10 percent of householders said they do not perform
these activities well.
Both prisoners and householders who said they read, write, and do
arithmetic very well demonstrate higher proficiencies than those who said they
do so well, who in turn demonstrate higher proficiencies than those who said
they do not read, write, or do arithmetic well (table 6.9). On the other hand,
prisoners who said they do these activities very well or well demonstrate lower
proficiencies than their counterparts in the household population. The exception
is on the document scale for those who reported reading well: inmates
demonstrate about the same proficiency as householders. Furthermore,
inmates who said they write and do arithmetic very well demonstrate about the
same proficiencies as householders who reported doing these activities well.
For example, the average prose proficiency of inmates who reported writing

Table 6.8
Percentages of Prison and Household Populations Reporting
Self-Perceptions of Ability to Perform Literacy Activities

                                          SELF-PERCEPTION
ACTIVITIES BY                 WGT N     Very well    Well         Not well     Not at all
POPULATIONS           n      (/1,000)   RPCT (SE)    RPCT (SE)    RPCT (SE)    RPCT (SE)

Read English
  Prison             1,144       763    57 (1.8)     31 (1.8)     10 (0.8)     2 (0.4)
  Household         24,897   190,164    71 (0.7)     22 (0.6)      5 (0.2)     2 (0.1)
Write English
  Prison             1,144       764    52 (1.7)     33 (1.9)     13 (1.2)     3 (0.4)
  Household         24,855   189,884    64 (0.8)     27 (0.7)      7 (0.2)     3 (0.1)
Do Arithmetic
  Prison             1,145       765    40 (1.9)     41 (1.9)     17 (1.5)     2 (0.3)
  Household         24,916   190,259    53 (0.8)     35 (0.7)      9 (0.3)     3 (0.1)

n = sample size; WGT N = population size estimate /1,000 (the sample sizes for subpopulations may not add up to the total sample
sizes because of missing data); RPCT = row percentage estimate; (SE) = standard error of the estimate (the true population value
can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey, 1992.

Chapter 6 . . . . . . 81

TABLE 6.9
Average Proficiencies of Prison and Household Populations
Reporting Self-Perceptions of Ability to Perform Literacy Activities

                                           SELF-PERCEPTION
ACTIVITIES BY
SCALES BY                     WGT N     Very well     Well          Not well      Not at all
POPULATIONS           n      (/1,000)   PROF ( SE )   PROF ( SE )   PROF ( SE )   PROF ( SE )

Read English
  Prose
    Prison           1,144       763    267 ( 2.3)    241 ( 3.2)    165 ( 6.9)    *** ( ****)
    Household       24,897   190,164    291 ( 0.6)    253 ( 1.2)    163 ( 3.3)    119 ( 3.3)
  Document
    Prison           1,144       763    262 ( 2.5)    240 ( 3.9)    164 ( 6.4)    *** ( ****)
    Household       24,897   190,164    284 ( 0.6)    249 ( 1.4)    168 ( 3.4)    112 ( 3.5)
Write English
  Prose
    Prison           1,144       764    266 ( 2.6)    244 ( 3.1)    192 ( 6.0)    *** ( ****)
    Household       24,855   189,884    292 ( 0.6)    262 ( 1.2)    193 ( 2.8)    124 ( 2.8)
  Document
    Prison           1,144       764    261 ( 2.6)    241 ( 3.4)    191 ( 6.9)    *** ( ****)
    Household       24,855   189,884    285 ( 0.7)    257 ( 1.3)    196 ( 2.8)    119 ( 2.9)
Do Arithmetic
  Quantitative
    Prison           1,145       765    264 ( 3.6)    231 ( 3.2)    195 ( 5.5)    *** ( ****)
    Household       24,916   190,259    295 ( 0.8)    264 ( 1.1)    208 ( 2.3)    117 ( 3.5)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); PROF = average proficiency estimate; (SE) = standard error of the estimate (the true population value can be said to
be within 2 standard errors of the sample estimate with 95% certainty).
*** Sample size is insufficient to permit a reliable estimate (fewer than 45 respondents).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

very well is 266, and the prose proficiency of householders who reported writing
well is 262. On the other hand, prisoners who reported not doing these activities
well perform about the same as their counterparts in the household population.
While between 80 and 90 percent of inmates reported they do these
activities very well or well, about 50 percent of prisoners are involved in
education or education and vocational programs in prison. (See chapter 4.)
This participation rate may be more a reflection of their demonstrated skills

82 . . . . . . Chapter 6

than of a self-perceived need to improve their skills. Even those inmates who
said they read, write, and do arithmetic very well or well perform, on average,
in Level 2.

Collaboration
Another way to determine how adults view their ability to read and write is to
ask how often they receive help on various literacy-related tasks. Both inmates
and householders were asked how frequently they get help with the following
tasks:
• filling out forms
• reading newspaper articles or other written information
• reading printed information associated with government agencies, public
  companies, private business, hospitals, etc.
• writing notes and letters
• using basic arithmetic, such as when filling out forms
As shown in table 6.10, about the same percentage of inmates reported
getting no help with writing notes and using arithmetic (about 80 percent),
while a lower percentage (about 60 percent) reported getting no help with
filling out forms and reading printed information. When compared with the
household population, more inmates than householders reported getting no
help with all tasks except using arithmetic. Generally, more householders than
inmates reported getting some or a little help for all tasks except using
arithmetic. What we cannot know from these data is whether inmates do not
think they need help or whether the prison environment is not conducive to
their seeking the help they need.
As shown in table 6.11, those in both populations who reported getting a
lot of help consistently demonstrate average proficiencies in the range of
Level 1, regardless of the task. Even so, the proficiency scores of prisoners who
get a lot of help are significantly lower than those of their counterparts in the
household population for all tasks on all scales, with the exception of writing
notes. These inmates truly seem to need help since their proficiency scores
indicate that they are able to perform, on average, only the simplest of tasks
in Level 1.

Chapter 6 . . . . . . 83

Table 6.10
Percentages of Prison and Household Populations Reporting
Frequency of Getting Help With Various Tasks

                                            FREQUENCY
TASKS BY                      WGT N     A lot        Some         A little     None
POPULATIONS           n      (/1,000)   RPCT (SE)    RPCT (SE)    RPCT (SE)    RPCT (SE)

Filling Out Forms
  Prison             1,143       762    10 (0.8)     12 (1.2)     18 (1.3)     60 (2.1)
  Household         24,914   190,334    12 (0.3)     18 (0.4)     21 (0.4)     49 (0.6)
Reading Newspapers
  Prison             1,143       762     7 (0.7)      8 (0.8)     12 (1.1)     74 (1.7)
  Household         24,910   190,290     5 (0.2)     11 (0.4)     17 (0.3)     67 (0.5)
Reading Printed Information
  Prison             1,139       758    11 (1.1)     11 (1.1)     15 (1.3)     63 (2.2)
  Household         24,881   190,096     9 (0.3)     16 (0.4)     23 (0.4)     52 (0.6)
Writing Notes, Letters
  Prison             1,140       760     7 (0.9)      6 (0.6)      8 (0.8)     80 (1.3)
  Household         24,875   190,055     5 (0.2)      7 (0.2)     11 (0.3)     76 (0.4)
Using Arithmetic
  Prison             1,143       762     6 (1.0)      5 (0.8)     10 (1.1)     78 (1.8)
  Household         24,911   190,292     5 (0.2)      6 (0.2)      9 (0.2)     80 (0.4)

n = sample size; WGT N = population size estimate /1,000 (the sample sizes for subpopulations may not add up to the total sample
sizes because of missing data); RPCT = row percentage estimate; (SE) = standard error of the estimate (the true population value
can be said to be within 2 standard errors of the sample estimate with 95% certainty).
Source: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey, 1992.

For each task except reading printed information, there are significant
differences in the proficiency scores of householders between each category
of frequency. That is, those who get some help attain higher scores than those
who get a lot, those who get a little help attain higher scores than those who get
some, and those who get no help attain higher scores than those who get a
little. For almost every task, the average proficiency of those householders
who get no help is around 280, or in Level 3. This step-by-step increase in
proficiency scores is not evident, however, for the prison population. For all
tasks, the scores of inmates who get some help are significantly higher than the
scores of those who get a lot. Only for reading printed information and using
arithmetic, however, are the scores of those who get a little help significantly
higher than the scores of those who get some help. Finally, the scores of
inmates who get no help are significantly higher than the scores of those who
get a little help, except for reading printed information. In contrast with the
household population, those inmates who get no help perform, on average, at

84 . . . . . . Chapter 6

TABLE 6.11
Average Proficiencies on Literacy Scales of Prison and Household
Populations, by Frequency of Help Received for Various Tasks

                                             FREQUENCY
TASKS BY SCALES BY             WGT N     A lot         Some          A little      None
POPULATIONS            n      (/1,000)   PROF ( SE )   PROF ( SE )   PROF ( SE )   PROF ( SE )

Filling Out Forms
  Document
    Prison            1,143       762    165 ( 8.4)    236 ( 6.7)    234 ( 4.1)    257 ( 2.3)
    Household        24,914   190,334    217 ( 2.0)    258 ( 1.4)    274 ( 1.3)    279 ( 0.8)
Reading Newspapers
  Prose
    Prison            1,143       762    168 ( 7.3)    220 ( 8.3)    234 ( 5.4)    259 ( 1.9)
    Household        24,910   190,290    184 ( 3.5)    243 ( 1.8)    274 ( 1.3)    284 ( 0.7)
Reading Printed Information
  Prose
    Prison            1,139       758    195 ( 6.7)    239 ( 5.9)    262 ( 4.2)    253 ( 1.9)
    Household        24,881   190,096    210 ( 2.5)    263 ( 1.7)    283 ( 1.2)    281 ( 0.8)
  Document
    Prison            1,139       758    172 ( 7.6)    239 ( 5.6)    256 ( 4.0)    250 ( 2.3)
    Household        24,881   190,096    206 ( 2.3)    259 ( 1.7)    279 ( 1.0)    274 ( 0.8)
Writing Notes, Letters
  Prose
    Prison            1,140       760    169 ( 8.9)    223 ( 7.7)    216 ( 7.5)    257 ( 1.7)
    Household        24,875   190,055    182 ( 3.4)    242 ( 2.5)    270 ( 1.6)    282 ( 0.6)
Doing Arithmetic
  Quantitative
    Prison            1,143       762    146 (12.7)    190 ( 9.4)    225 ( 7.1)    247 ( 3.1)
    Household        24,911   190,292    181 ( 3.2)    226 ( 3.0)    254 ( 1.6)    282 ( 0.8)

n = sample size; WGT N = population size estimate / 1,000 (the sample sizes for subpopulations may not add up to the total sample sizes,
due to missing data); PROF = average proficiency estimate; (SE) = standard error of the estimate (the true population value can be said to be
within 2 standard errors of the sample estimate with 95% certainty).
Source: Educational Testing Service, National Adult Literacy Survey, 1992.

about the mid-range of Level 2, around 250, which is to be expected, given that
the average proficiency scores of the inmate population on all three scales fall
in Level 2.

Chapter 6 . . . . . . 85

Summary
Various literacy practices were compared for the prison and household populations. The
most frequently read materials reported by inmates are letters and reports or articles.
One kind of material that more inmates than householders reported reading frequently
is letters. The proficiency scores of inmates who reported reading any material
frequently are significantly higher than the scores of those who reported reading less
than once a week. When compared with the proficiencies of householders, however, the
proficiencies of inmates who read any materials frequently are not only lower than those
of householders who read the same materials frequently, but are also comparable, in
some cases, to the proficiencies of householders who read the materials less than once a
week.
As is the case with reading letters, more inmates than householders reported
writing letters on a frequent basis; however, the proficiency scores of these inmates are
significantly lower than those of the householders. Once again, the proficiency scores of
inmates who perform a particular writing task frequently tend to be comparable with the
scores of householders who do the same task less than once a week.
More householders than inmates reported using arithmetic on a frequent basis. As
with other literacy practices, those in both populations who frequently use arithmetic
perform at higher levels than those who use arithmetic less than once a week. In
addition, the proficiencies of householders generally are higher than those of inmates
except for the proficiencies of those reporting infrequent use of arithmetic.
Significantly fewer inmates than householders reported not reading a book within
the last six months. Inmates reported reading fiction, reference, and inspiration and
religious books more than other types of books, whereas householders reported reading
manuals, reference, and fiction more. As expected, inmates demonstrate lower
proficiencies than householders when compared by type of books read. On the other
hand, greater percentages of inmates at each literacy level reported reading books as
compared with householders.
Significantly fewer inmates than householders reported that they read or write
English very well or do arithmetic very well, while a greater percentage of inmates than
householders said they did not do these activities well. Inmates who reported
performing these activities very well or well demonstrate lower proficiencies than their
counterparts in the household population. Inmates who said they did not perform these
activities well, however, perform about the same as householders who also reported not
doing these activities well.
For the prison population the areas of greatest need as reflected in the frequency
of getting help seem to be filling out forms and reading printed information. Regardless
of the task, those who reported getting more help generally have lower proficiency
scores. Those needing help apparently seek assistance, while those with relatively higher
skills operate more independently.

86 . . . . . . Chapter 6

APPENDIX A
Interpreting the Literacy Scales*

Building on the two earlier literacy surveys conducted by Educational
Testing Service (ETS), the performance results from the National Adult
Literacy Survey are reported on three literacy scales — prose, document, and
quantitative — rather than on a single conglomerate scale. Each of the three
literacy scales ranges from 0 to 500.
The purpose of this section of the report is to give meaning to the literacy
scales — or, more specifically, to interpret the numerical scores that are used to
represent adults’ proficiencies on these scales. Toward this end, the section
begins with a brief summary of the task development process and of the way
in which the literacy levels are defined. A detailed description of the prose,
document, and quantitative scales is then provided. The five levels on each
scale are defined, and the skills and strategies needed to successfully perform
the tasks in each level are discussed. Sample tasks are presented to illustrate
the types of materials and task demands that characterize the levels on each
scale. The section ends with a brief summary of the probabilities of successful
performance on tasks within each level for individuals who demonstrated
different proficiencies.

Building the Literacy Tasks
The literacy scales make it possible not only to summarize the literacy
proficiencies of the total population and of various subpopulations, but also to
determine the relative difficulty of the literacy tasks administered in the survey.
That is, just as an individual receives a score according to his or her
performance on the assessment tasks, each task receives a value according to its
difficulty as determined by the performance of the adults who participated in
the survey. Previous research conducted at ETS has shown that the difficulty of
*This chapter originally appeared in the first report on the National Adult Literacy Survey: I. S. Kirsch,
A. Jungeblut, L. Jenkins, and A. Kolstad. (September 1993). Adult Literacy in America: A First Look at the
Results of the National Adult Literacy Survey. Washington, DC: U.S. Department of Education.

Appendix A . . . . . . 87

a literacy task, and therefore its placement on a particular literacy scale, is
determined by three factors: the structure or linguistic format of the material,
the content and/or the context from which it is selected, and the nature of the
task, or what the individual is asked to do with the material.
Materials. The materials selected for inclusion in NALS reflect a variety of
linguistic formats that adults encounter in their daily activities. Most of the
prose materials used in the survey are expository — that is, they describe,
define, or inform — since most of the prose that adults read is expository in
nature; however, narratives and poetry are included, as well. The prose
materials include an array of linguistic structures, ranging from texts that are
highly organized both topically and visually to those that are loosely organized.
They also include texts of varying lengths, from multiple-page magazine
selections to short newspaper articles. All prose materials included in the
survey were reproduced in their original format.
The document materials represent a wide variety of structures, which are
characterized as tables, charts and graphs, forms, and maps, among other
categories. Tables include matrix documents in which information is arrayed in
rows and columns: for example, bus or airplane schedules, lists, or tables of
numbers. Documents categorized as charts and graphs include pie charts, bar
graphs, and line graphs. Forms are documents that require information to be
filled in, while other structures include such materials as advertisements and
coupons.
The quantitative tasks require the reader to perform arithmetic operations
using numbers that are embedded in print. Since there are no materials that
are unique to quantitative tasks, these tasks were based on prose materials and
documents. Most quantitative tasks were, in fact, based on document structures.
Content and/or Contexts. Adults do not read printed or written materials
in a vacuum. Rather, they read within a particular context or for a particular
purpose. Accordingly, the NALS materials represent a variety of contexts and
contents. Six such areas were identified: home and family; health and safety;
community and citizenship; consumer economics; work; and leisure and
recreation.
In selecting materials to represent these areas, efforts were made to
include as broad a range as possible, as well as to select universally relevant
contexts and contents. This was to ensure that the materials would not be so
specialized as to be familiar only to certain groups. In this way, disadvantages
for individuals with limited background knowledge were minimized.
Types of Tasks. After the materials were selected, tasks were developed to
accompany the materials. These tasks were designed to simulate the ways in
which people use various types of materials and to require different strategies

88 . . . . . . Appendix A

for successful task completion. For both the prose and document scales, the
tasks can be organized into three major categories: locating, integrating, and
generating information. In the locating tasks, readers are asked to match
information that is given in a question or directive with either literal or
synonymous information in the text or document. Integrating tasks require the
reader to incorporate two or more pieces of information located in different
parts of the text or document. Generating tasks require readers not only to
process information located in different parts of the material, but also to go
beyond that information by drawing on their knowledge about a subject or by
making broad text-based inferences.
Quantitative tasks require readers to perform arithmetic operations —
addition, subtraction, multiplication, or division — either singly or in
combination. In some tasks, the type of operation that must be performed is
obvious from the wording of the question, while in other tasks the readers must
infer which operation is to be performed. Similarly, the numbers that are
required to perform the operation can, in some cases, be easily identified,
while in others, the numbers that are needed are embedded in text. Moreover,
some quantitative tasks require the reader to explain how the problem would
be solved rather than to perform the calculation, and on some tasks the use of a
simple four-function calculator is required.

Defining the Literacy Levels
The relative difficulty of the assessment tasks reflects the interactions among
the various task characteristics described here. As shown in Figure 1 in the
Introduction to this report, the score point assigned to each task is the point at
which the individuals with that proficiency score have a high probability of
responding correctly. In this survey, an 80 percent probability of correct
response was the criterion used. While some tasks were at the very low end
of the scale and some at the very high end, most had difficulty values in the
200 to 400 range.
By assigning scale values to both the individuals and tasks, it is possible to
see how well adults with varying proficiencies performed on tasks of varying
difficulty. While individuals with low proficiency tend to perform well on tasks
with difficulty values equivalent to or below their level of proficiency, they are
less likely to succeed on tasks with higher difficulty values. This does not mean
that individuals with low proficiency can never succeed on more difficult
literacy tasks, that is, on tasks whose difficulty values are higher than their
proficiencies. They may do so some of the time. Rather, it means that their

Appendix A . . . . . . 89

probability of success is not as high. In other words, the more difficult the task
relative to their proficiency, the lower their likelihood of responding correctly.
The response probabilities for two tasks on the prose scale are displayed in
Figure 3.1. The difficulty of the first task is measured at the 250 point on the
scale, and the second task is at the 350 point. This means that an individual
would have to score at the 250 point on the prose scale to have an 80 percent
chance (that is, a .8 probability) of responding correctly to Task 1. Adults
scoring at the 200 point on the prose scale have only a 40 percent chance of
responding correctly to this task, whereas those scoring at the 300 point and
above would be expected to rarely miss this task and others like it.
In contrast, an individual would need to score at the 350 point to have an
80 percent chance of responding correctly to Task 2. While individuals
performing at the 250 point would have an 80 percent chance of success on the
first task, their probability of answering the more difficult second task correctly
Figure 3.1
Probabilities of Successful Performance on Two Prose Tasks by Individuals at
Selected Points on the Prose Scale
[Line graph: probability of successful performance, from 0.0 to 1.0, plotted
against adults' average prose proficiency, from 150 to 400, with one curve for
Task 1 and one for Task 2.]
Source: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey, 1992.

is only 20 percent. An individual scoring at the 300 point is likely to succeed on
this more difficult task only half the time.
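The pattern shown in Figure 3.1 can be approximated with a simple logistic curve. The sketch below is illustrative only: the function name and the `scale` value of 36 are our own choices, tuned so the curve roughly reproduces the probabilities quoted above, and the actual survey used a more elaborate item response theory model. The curve is anchored so that the probability is exactly .8 when proficiency equals the task's difficulty value.

```python
import math

def response_probability(proficiency, difficulty, scale=36.0):
    """Illustrative logistic curve for the chance of answering a task
    correctly. Anchored so that p = 0.80 when proficiency equals the
    task's difficulty value, matching the 80 percent criterion.
    The scale of 36 is a hypothetical value, not the survey's own."""
    # The point where p = 0.5 sits below the 80-percent difficulty value.
    midpoint = difficulty - scale * math.log(4)
    return 1.0 / (1.0 + math.exp(-(proficiency - midpoint) / scale))

# Task 2 has a difficulty value of 350 on the prose scale.
print(round(response_probability(350, 350), 2))  # 0.8 at the task's own level
print(round(response_probability(250, 350), 2))  # about 0.2 for adults at 250
```

As proficiency falls further below a task's difficulty value, the curve drops toward zero, which is the behavior the high-jump analogy below describes.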
An analogy may help clarify the information presented for the two prose
tasks. The relationship between task difficulty and individual proficiency is
much like the high jump event in track and field, in which an athlete tries to
jump over a bar that is placed at increasing heights. Each high jumper has a

height at which he or she is proficient. That is, he or she is able to clear the bar
at that height with a high probability of success, and can clear the bar at lower
levels almost every time. When the bar is set higher than that height,
however, he or she can be expected to have a much lower chance of clearing it successfully.
Once the literacy tasks are placed on their respective scales, using the
criterion described here, it is possible to see how well the interactions among
the task characteristics explain the placement of various tasks along the scales.1
In investigating the progression of task characteristics across the scales, certain
questions are of interest. Do tasks with similar difficulty values (that is, with
difficulty values near one another on a scale) have certain shared
characteristics? Do these characteristics differ in systematic ways from tasks in
either higher or lower levels of difficulty? Analyses of the interactions between
the materials read and the tasks based on these materials reveal that an ordered
set of information-processing skills appears to be called into play to perform
the range of tasks along each scale.
To capture this ordering, each scale was divided into five levels that reflect
the progression of information-processing skills and strategies: Level 1 (0 to 225),
Level 2 (226 to 275), Level 3 (276 to 325), Level 4 (326 to 375), and Level 5
(376 to 500). These levels were determined not as a result of any statistical
property of the scales, but rather as a result of shifts in the skills and strategies
required to succeed on various tasks along the scales, from simple to complex.
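Because the level boundaries are fixed cut points on the 0 to 500 scale, assigning a score to a level is a simple lookup. A minimal sketch (the function name is ours; the cut points are those listed above):

```python
def literacy_level(score):
    """Map a 0-500 literacy scale score to its level, using the cut
    points given in the text: Level 1 (0 to 225), Level 2 (226 to 275),
    Level 3 (276 to 325), Level 4 (326 to 375), Level 5 (376 to 500)."""
    for level, upper in enumerate((225, 275, 325, 375, 500), start=1):
        if score <= upper:
            return level
    raise ValueError("score must be between 0 and 500")

print(literacy_level(198))  # 1 (the average Level 1 prose task difficulty)
print(literacy_level(298))  # 3
```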
The remaining pages of this section describe each scale in terms of the
nature of the task demands at each of the five levels. After a brief introduction
to each scale, sample tasks in each level are presented and the factors
contributing to their difficulty are discussed. The aim of these discussions is to
give meaning to the scales and to facilitate interpretation of the results
provided in the first and second sections of this report.

Interpreting the Literacy Levels
Prose Literacy
The ability to understand and use information contained in various kinds of
textual material is an important aspect of literacy. Most of the prose materials
administered in this assessment were expository — that is, they inform, define,
or describe — since these constitute much of the prose that adults read. Some
narrative texts and poems were included, as well. The prose materials were
drawn from newspapers, magazines, books, brochures, and pamphlets and
reprinted in their entirety, using the typography and layout of the original
1 I.S. Kirsch and P.B. Mosenthal. (1990). “Exploring Document Literacy: Variables Underlying the
Performance of Young Adults.” Reading Research Quarterly, 25, pp. 5-30.


source. As a result, the materials vary widely in length, density of information,
and the use of structural or organizational aids such as section or paragraph
headings, italic or bold face type, and bullets.
Each prose selection was accompanied by one or more questions or
directives which asked the reader to perform specific tasks. These tasks
represent three major aspects of information-processing: locating, integrating,
and generating. Locating tasks require the reader to find information in the
text based on conditions or features specified in the question or directive. The
match may be literal or synonymous, or the reader may need to make a text-based inference in order to perform the task successfully. Integrating tasks ask
the reader to compare or contrast two or more pieces of information from the
text. In some cases the information can be found in a single paragraph, while in
others it appears in different paragraphs or sections. In the generating tasks,
readers must produce a written response by making text-based inferences or
drawing on their own background knowledge.
In all, the prose literacy scale includes 41 tasks with difficulty values
ranging from 149 to 468. It is important to remember that the locating,
generating, and integrating tasks extend over a range of difficulty as a result of
interactions with other variables including:
• the number of categories or features of information that the reader must
process
• the number of categories or features of information in the text that can
distract the reader, or that may seem plausible but are incorrect
• the degree to which information given in the question is obviously related to
the information contained in the text
• the length and density of the text

The five levels of prose literacy are defined, and sample tasks provided, in
the following pages.
Prose Level 1

Scale range: 0 to 225

Most of the tasks in this level require the reader to read relatively
short text to locate a single piece of information which is identical to
or synonymous with the information given in the question or
directive. If plausible but incorrect information is present in the text,
it tends not to be located near the correct information.
Average difficulty value of tasks in this level: 198
Percentage of adults performing in this level: 21%


Tasks in this level require the reader to locate and match a single piece of
information in the text. Typically the match between the question or directive
and the text is literal, although sometimes synonymous matches may be
necessary. The text is usually brief or has organizational aids such as paragraph
headings or italics that suggest where in the text the reader should search for
the specified information. The word or phrase to be matched appears only
once in the text.
One task in Level 1 with a difficulty value of 208 asks respondents to read
a newspaper article about a marathon swimmer and to underline the sentence
that tells what she ate during a swim. Only one reference to food is contained
in the passage, and it does not use the word “ate.” Rather, the article says the
swimmer “kept up her strength with banana and honey sandwiches, hot
chocolate, lots of water and granola bars.” The reader must match the word
“ate” in the directive with the only reference to foods in the article.

Underline the sentence that tells what Ms. Chanin
ate during the swim.

Swimmer completes
Manhattan marathon
The Associated Press
NEW YORK—University of Maryland
senior Stacy Chanin on Wednesday became
the first person to swim three 28-mile laps
around Manhattan.
Chanin, 23, of Virginia, climbed out of
the East River at 96th Street at 9:30 p.m.
She began the swim at noon on Tuesday.
A spokesman for the swimmer, Roy
Brunett, said Chanin had kept up her
strength with “banana and honey
sandwiches, hot chocolate, lots of water
and granola bars.”

Chanin has twice circled Manhattan
before and trained for the new feat by
swimming about 28.4 miles a week. The
Yonkers native has competed as a swimmer
since she was 15 and hoped to persuade
Olympic authorities to add a long-distance
swimming event.
The Leukemia Society of America
solicited pledges for each mile she swam.
In July 1983, Julie Ridge became the
first person to swim around Manhattan
twice. With her three laps, Chanin came
up just short of Diana Nyad’s distance
record, set on a Florida-to-Cuba swim.


Reduced from original copy.


Prose Level 2

Scale range: 226 to 275

Some tasks in this level require readers to locate a single piece of
information in the text; however, several distractors or plausible but
incorrect pieces of information may be present, or low-level inferences
may be required. Other tasks require the reader to integrate two or
more pieces of information or to compare and contrast easily
identifiable information based on a criterion provided in the question
or directive.
Average difficulty value of tasks in this level: 259
Percentage of adults performing in this level: 27%

Like the tasks in Level 1, most of the tasks in this level ask the reader to
locate information. However, these tasks place more varied demands on the
reader. For example, they frequently require readers to match more than a
single piece of information in the text and to discount information that only
partially satisfies the question. If plausible but incomplete information is
included in the text, such distractors do not appear near the sentence or
paragraph that contains the correct answer. For example, a task based on the
sports article reproduced earlier asks the reader to identify the age at which the
marathon swimmer began to swim competitively. The article first provides the
swimmer’s current age of 23, which is a plausible but incorrect answer. The
correct information, age 15, is found toward the end of the article.
In addition to directing the reader to locate more than a single piece of
information in the text, low-level inferences based on the text may be required
to respond correctly. Other tasks in Level 2 (226 to 275) require the reader to
identify information that matches a given criterion. For example, in one task
with a difficulty value of 275, readers were asked to identify specifically what
was wrong with an appliance by choosing the most appropriate of four
statements describing its malfunction.



A manufacturing company provides its customers with the following instructions for returning appliances for service:

When returning appliance for servicing, include a note telling as clearly and
as specifically as possible what is wrong with the appliance.

A repair person for the company receives four appliances with the
following notes attached. Circle the letter next to the note which
best follows the instructions supplied by the company.

A

The clock does not run
correctly on this clock
radio. I tried fixing it, but
I couldn’t.

B

My clock radio is not working. It
stopped working right after I used
it for five days.

C

The alarm on my clock
radio doesn’t go off at the
time I set. It rings 15-30
minutes later.

D

This radio is broken. Please
repair and return by United
Parcel Service to the address on
my slip.


Readers in this level may also be asked to infer a recurring theme. One
task with a difficulty value of 262 asks respondents to read a poem that uses
several metaphors to represent a single, familiar concept and to identify its
theme. The repetitiveness and familiarity of the allusions appear to make this
“generating” task relatively easy.


Prose Level 3

Scale range: 276 to 325

Tasks in this level tend to require readers to make literal or
synonymous matches between the text and information given in the
task, or to make matches that require low-level inferences. Other tasks
ask readers to integrate information from dense or lengthy text that
contains no organizational aids such as headings. Readers may also
be asked to generate a response based on information that can be
easily identified in the text. Distracting information is present, but is
not located near the correct information.
Average difficulty value of tasks in this level: 298
Percentage of adults performing in this level: 32%

One of the easier Level 3 tasks requires the reader to write a brief letter
explaining that an error has been made on a credit card bill. This task is at 280
on the prose scale. Other tasks in this level require the reader to search fairly
dense text for information. Some of the tasks ask respondents to make a literal
or synonymous match on more than a single feature, while other tasks ask them
to integrate multiple pieces of information from a long passage that does not
contain organizational aids.
One of the more difficult Level 3 tasks (with a difficulty value of 316)
requires the reader to read a magazine article about an Asian-American woman
and to provide two facts that support an inference made from the text. The
question directs the reader to identify what Ida Chen did to help resolve
conflicts due to discrimination.

List two things that Chen became involved in or has
done to help resolve conflicts due to discrimination.


IDA CHEN is the first Asian-American woman to
become a judge of the Commonwealth of Pennsylvania.
She understands
discrimination because she
has experienced it herself.
Soft-spoken and eminently dignified,
Judge Ida Chen prefers hearing about a
new acquaintance rather than talking
about herself. She wants to know about
career plans, hopes, dreams, fears. She
gives unsolicited advice as well as
encouragement. She instills confidence.
Her father once hoped that she
would become a professor. And she
would have also made an outstanding
social worker or guidance counselor.
The truth is that Chen wears the caps of
all these professions as a Family Court
judge of the Court of Common Pleas of
Philadelphia County, as a participant in
public advocacy for minorities, and as a
particularly sensitive, caring person.
She understands discrimination
because she has experienced it herself.
As an elementary school student, Chen
tried to join the local Brownie troop.
‘‘You can’t be a member,’’ she was told.
‘‘Only American girls are in the
Brownies.’’
Originally intent upon a career as a
journalist, she selected Temple University because of its outstanding journalism department and affordable tuition.
Independence being a personal need, she
paid for her tuition by working for
Temple’s Department of Criminal
Justice. There she had her first encounter with the legal world and it turned
her career plans in a new direction —
law school.
Through meticulous planning, Chen
was able to earn her undergraduate
degree in two and a half years and she
continued to work three jobs. But when
she began her first semester as a Temple
law student in the fall of 1973, she was
barely able to stay awake. Her teacher
Lynne Abraham, now a Common Pleas
Court judge herself, couldn’t help but
notice Chen yawning in the back of the
class, and when she determined that
this student was not a party animal but
a workhorse, she arranged a teaching
assistant’s job for Chen on campus.
After graduating from Temple Law
School in 1976, Chen worked for the
U.S. Equal Employment Opportunity
Commission where she was a litigator
on behalf of plaintiffs who experienced
discrimination in the workplace, and

then moved on to become the first
Asian-American to serve on the
Philadelphia Commission on Human
Relations.
Appointed by Mayor Wilson Goode,
Chen worked with community leaders
to resolve racial and ethnic tensions and
also made time to contribute free legal
counsel to a variety of activist groups.
The ‘‘Help Wanted’’ section of the
newspaper contained an entry that
aroused Chen’s curiosity — an ad for a
judge’s position. Her application
resulted in her selection by a state
judicial committee to fill a seat in the
state court. And in July of 1988, she
officially became a judge of the Court of
Common Pleas. Running as both a
Republican and Democratic candidate,
her position was secured when she won
her seat on the bench at last November’s election.
At Family Court, Chen presides over
criminal and civil cases which include
adult sex crimes, domestic violence,
juvenile delinquency, custody, divorce
and support. Not a pretty picture.
Chen recalls her first day as judge,
hearing a juvenile dependency case —
‘‘It was a horrifying experience. I broke
down because the cases were so
depressing,’’ she remembers.
Outside of the courtroom, Chen has
made a name for herself in resolving
interracial conflicts, while glorying in
her Chinese-American identity. In a
1986 incident involving the desecration
of Korean street signs in a Philadelphia
neighborhood, Chen called for a
meeting with the leaders of that
community to help resolve the conflict.
Chen’s interest in community
advocacy is not limited to Asian
communities. She has been involved in
Hispanic, Jewish and Black issues, and
because of her participation in the
Ethnic Affairs Committee of the Anti-Defamation League of B’nai B’rith,
Chen was one of 10 women nationwide
selected to take part in a mission to
Israel.
With her recently won mandate to
adjudicate in the affairs of Pennsylvania’s
citizens, Chen has pledged to work
tirelessly to defend the rights of its
people and contribute to the improvement of human welfare. She would have
made a fabulous Brownie.
— Jessica Schultz



Prose Level 4

Scale range: 326 to 375

These tasks require readers to perform multiple-feature matches and
to integrate or synthesize information from complex or lengthy
passages. More complex inferences are needed to perform
successfully. Conditional information is frequently present in tasks in
this level and must be taken into consideration by the reader.
Average difficulty value of tasks in this level: 352
Percentage of adults performing in this level: 17%

A prose task with a difficulty value of 328 requires the reader to synthesize
the repeated statements of an argument from a newspaper column in order to
generate a theme or organizing principle. In this instance, the supporting
statements are elaborated in different parts of a lengthy text.
A more challenging task (with a difficulty value of 359) directs the reader
to contrast the two opposing views stated in the newspaper feature reprinted
here that discusses the existence of technologies that can be used to produce
more fuel-efficient cars.

Contrast Dewey’s and Hanna’s views about the
existence of technologies that can be used to
produce more fuel-efficient cars while maintaining
the size of the cars.


[Newspaper feature reproduced here; reduced from original copy.]


Two other tasks in Level 4 on the prose scale require the reader to draw
on background knowledge in responding to questions asked about two poems.
In one they are asked to generate an unfamiliar theme from a short poem
(difficulty value of 362), and in the other they are asked to compare two
metaphors (value of 374).

Prose Level 5

Scale range: 376 to 500

Some tasks in this level require the reader to search for information in
dense text which contains a number of plausible distractors. Others
ask readers to make high-level inferences or use specialized
background knowledge. Some tasks ask readers to contrast complex
information.
Average difficulty value of tasks in this level: 423
Percentage of adults performing in this level: 3%

Two tasks in Level 5 require the reader to search for information in dense
text containing several plausible distractors. One such task (difficulty value of
410) requires the respondent to read information about jury selection and
service. The question requires the reader to interpret information to identify
two ways in which prospective jurors may be challenged.

Identify and summarize the two kinds of challenges
that attorneys use while selecting members of a jury.


DO YOU HAVE A QUESTION?
QUESTION: What is the new program for
scheduling jurors?
ANSWER: This is a new way of organizing
and scheduling jurors that is being introduced all over the country. The goals of
this program are to save money, increase
the number of citizens who are summoned
to serve and decrease the inconvenience
of serving.
The program means that instead of calling jurors for two weeks, jurors now serve
only one day, or for the length of one trial
if they are selected to hear a case. Jurors
who are not selected to hear a case are
excused at the end of the day, and their
obligations to serve as jurors are fulfilled
for three years. The average trial lasts
two days once testimony begins.
An important part of what is called the
One Day – One Trial program is the
‘‘standby’’ juror. This is a person called to
the Courthouse if the number of cases to
be tried requires more jurors than originally estimated. Once called to the Courthouse, the standby becomes a ‘‘regular’’
juror, and his or her service is complete at
the end of one day or one trial, the same
as everyone else.

Q. How was I summoned?
A. The basic source for names of eligible
jurors is the Driver’s License list which is
supplemented by the voter registration
list. Names are chosen from these combined lists by a computer in a completely
random manner.
Once in the Courthouse, jurors are
selected for a trial by this same computer
and random selection process.
Q. How is the Jury for a particular trial
selected?
A. When a group of prospective jurors is
selected, more than the number needed
for a trial are called. Once this group has
been seated in the courtroom, either the
Judge or the attorneys ask questions.
This is called voir dire. The purpose of
questions asked during voir dire is to

ensure that all of the jurors who are
selected to hear the case will be unbiased, objective and attentive.
In most cases, prospective jurors will be
asked to raise their hands when a particular question applies to them. Examples of
questions often asked are: Do you know
the Plaintiff, Defendant or the attorneys in
this case? Have you been involved in a
case similar to this one yourself? Where
the answer is yes, the jurors raising hands
may be asked additional questions, as
the purpose is to guarantee a fair trial for
all parties. When an attorney believes
that there is a legal reason to excuse a
juror, he or she will challenge the juror for
cause. Unless both attorneys agree that
the juror should be excused, the Judge
must either sustain or override the challenge.
After all challenges for cause have been
ruled upon, the attorneys will select the
trial jury from those who remain by exercising peremptory challenges. Unlike
challenges for cause, no reason need be
given for excusing a juror by peremptory
challenge. Attorneys usually exercise
these challenges by taking turns striking
names from a list until both are satisfied
with the jurors at the top of the list or until
they use up the number of challenges
allowed. Challenged jurors and any extra
jurors will then be excused and asked to
return to the jury selection room.
Jurors should not feel rejected or insulted
if they are excused for cause by the Court
or peremptorily challenged by one of the
attorneys. The voir dire process and
challenging of jurors is simply our judicial
system’s way of guaranteeing both parties to a lawsuit a fair trial.

Q. Am I guaranteed to serve on a jury?
A. Not all jurors who are summoned actually
hear a case. Sometimes all the Judges
are still working on trials from the previous day, and no new jurors are chosen.
Normally, however, some new cases begin
every day. Sometimes jurors are challenged and not selected.



A somewhat more demanding task (difficulty value of 423) involves the
magazine article on Ida Chen reproduced earlier. This more challenging task
requires the reader to explain the phrase “recently won mandate” used at the
end of the text. To explain this phrase, the reader needs to understand the
concept of a political mandate as it applies to Ida Chen and the way she is
portrayed in this article.
Document Literacy
Another important aspect of being literate in modern society is having the
knowledge and skills needed to process information from documents. We often
encounter tables, schedules, charts, graphs, maps, and forms in everyday life,
both at home and at work. In fact, researchers have found that many of us
spend more time reading documents than any other type of material.2 The
ability to locate and use information from documents is therefore essential.
Success in processing documents appears to depend at least in part on the
ability to locate information in complex arrays and to use this information in
the appropriate ways. Procedural knowledge may be needed to transfer
information from one source or document to another, as is necessary in
completing applications or order forms.
The NALS document literacy scale contains 81 tasks with difficulty values
that range from 69 to 396 on the scale. By examining tasks associated with
various proficiency levels, we can identify characteristics that appear to make
certain types of document tasks more or less difficult for readers. Questions
and directives associated with these tasks are basically of four types: locating,
cycling, integrating, and generating. Locating tasks require the readers to
match one or more features of information stated in the question to either
identical or synonymous information given in the document. Cycling tasks
require the reader to locate and match one or more features, but differ in that
they require the reader to engage in a series of feature matches to satisfy
conditions given in the question. The integrating tasks typically require the
reader to compare and contrast information in adjacent parts of the document.
In the generating tasks, readers must produce a written response by processing
information found in the document and also making text-based inferences or
drawing on their own background knowledge.

2 J.T. Guthrie, M. Seifert, and I.S. Kirsch. (1986). “Effects of Education, Occupation, and Setting on Reading
Practices.” American Educational Research Journal, 23, pp. 151-160.


As with the prose tasks, each type of question or directive extends over a
range of difficulty as a result of interactions among several variables or task
characteristics that include:
• the number of categories or features of information in the question that the
reader has to process or match
• the number of categories or features of information in the document that
can serve to distract the reader or that may seem plausible but are incorrect
• the extent to which the information asked for in the question is obviously
related to the information stated in the document, and
• the structure of the document
A more detailed discussion of the five levels of document literacy is
provided in the following pages.

Document Level 1

Scale range: 0 to 225

Tasks in this level tend to require the reader either to locate a piece of
information based on a literal match or to enter information from
personal knowledge onto a document. Little, if any, distracting
information is present.
Average difficulty value of tasks in this level: 195
Percentage of adults performing in this level: 23%

Some of the Level 1 tasks require the reader to match one piece of
information in the directive with an identical or synonymous piece of
information in the document. For example, readers may be asked to write a
piece of personal background information — such as their name or age — in
the appropriate place on a document. One task with a difficulty value of 69
directs individuals to look at a Social Security card and sign their name on the
line marked “signature.” Tasks such as this are quite simple, since only one
piece of information is required, it is known to the respondent, and there is
only one logical place on the document where it may be entered.


Here is a Social Security card. Sign your name on
the line that reads "signature".

Respondents were given a copy of a Social
Security card to complete this task.

Other tasks in this level are slightly more complex. For example, in one
task, readers were asked to complete a section of a job application by providing
several pieces of information. This was more complicated than the previous
task described, since respondents had to conduct a series of one-feature
matches. As a result, the difficulty value of this task was higher (193).

You have gone to an employment center for help in finding a
job. You know that this center handles many different kinds of
jobs. Also, several of your friends who have applied here have
found jobs that appeal to you.
The agent has taken your name and address and given you
the rest of the form to fill out. Complete the form so the
employment center can help you get a job.
Birth date______________ Age____ Sex: Male____ Female____
Height____________ Weight____________ Health____________
Last grade completed in school_______________
Kind of work wanted:
Part-time___________ Summer___________
Full-time___________ Year-round___________


Other tasks in this level ask the reader to locate specific elements in a
document that contains a variety of information. In one task, for example,
respondents were given a form providing details about a meeting and asked to
indicate the date and time of the meeting, which were stated in the form. The
difficulty values associated with these tasks were 187 and 180, respectively. The
necessary information was referred to only once in the document.

Document Level 2

Scale range: 226 to 275

Tasks in this level are more varied than those in Level 1. Some require
the reader to match a single piece of information; however, several
distractors may be present, or the match may require low-level
inferences. Tasks in this level may also ask the reader to cycle through
information in a document or to integrate information from various
parts of a document.
Average difficulty value of tasks in this level: 249
Percentage of adults performing in this level: 28%

Some tasks in Level 2 ask readers to match two pieces of information in
the text. For example, one task with a difficulty value of 275 directs the
respondent to look at a pay stub and to write “the gross pay for this year to
date.” To perform the task successfully, respondents must match both “gross
pay” and “year to date” correctly. If readers fail to match on both features, they
are likely to indicate an incorrect amount.

What is the gross pay for this year to date?
[Pay stub reproduced here: period ending 03/15/85, with columns for hours
(regular, 2nd shift, overtime, total), current and year-to-date gross pay, tax
deductions (federal, state, and city withholding, FICA), other deductions, and
net pay.]

Reduced from original copy.


A second question based on this document — What is the current net
pay? — was also expected to require readers to make a two-feature match.
Accordingly, the difficulty values of the two items were expected to be similar.
The task anchored at about the 224 point on the scale, however, and an analysis
of the pay stub reveals why its difficulty was lower than that of the previous
task. To succeed on the second task, the reader only needs to match on the
feature “net pay.” Since the term appears only once on the pay stub and there
is only one number in the column, this task requires only a one-feature match
and receives a difficulty value that lies within the Level 1 range on the
document scale.
Tasks in Level 2 may also require the reader to integrate information from
different parts of the document by looking for similarities or differences. For
example, a task with a difficulty value of 260 asks respondents to study a line
graph showing a company’s seasonal sales over a three-year period, then predict
the level of sales for the following year, based on the seasonal trends shown in
the graph.
You are a marketing manager for a small manufacturing firm. This graph
shows your company’s sales over the last three years. Given the
seasonal pattern shown on the graph, predict the sales for Spring 1985
(in thousands) by putting an “x” on the graph.

[Line graph, reduced from original copy, plotting sales (in thousands of units, on a vertical axis running from 10 to 80) by season, Summer through Spring, for 1982 through 1984, with an open position for Spring 1985.]

Document Level 3

Scale range: 276 to 325

Some tasks in this level require the reader to integrate multiple pieces
of information from one or more documents. Others ask readers to
cycle through rather complex tables or graphs which contain
information that is irrelevant or inappropriate to the task.
Average difficulty value of tasks in this level: 302
Percentage of adults performing in this level: 31%

Tasks within the range for Level 3 ask the reader to locate particular
features in complex displays, such as tables that contain nested information.
Typically, distractor information is present in the same row or column as the
correct answer. For example, the reader might be asked to use a table that
summarizes appropriate uses for a variety of products, and then choose which
product to use for a certain project. One such task had a difficulty value of 303.
To perform this task successfully, the respondent uses a table containing nested
information to determine the type of sandpaper to buy if one needs “to smooth
wood in preparation for sealing and plans to buy garnet sandpaper.” This task
requires matching not only on more than a single feature of information but
also on features that are not always superordinate categories in the document.
For example, “preparation for sealing” is subordinated or nested under the
category “wood,” while the type of sandpaper is under the main heading of
“garnet.” In addition, there are three other types of sandpaper that the reader
might select that partially satisfy the directive.


You need to smooth wood in preparation for sealing
and plan to buy garnet sandpaper. What type of
sandpaper should you buy?

[Abrasive selection guide, reduced from original copy. The table crosses materials and operations (WOOD: paint removal, heavy stock removal, moderate stock removal, preparation for sealing, after sealer, between coats, after final coat; METAL: rust and paint removal, light stock removal, preparation for priming, finishing and polishing, after primer, between coats, after final coat; PLASTIC & FIBERGLASS: shaping, light stock removal, finishing & scuffing) with sandpaper types (Production, Garnet, Wetordry, Fre-Cut, Emery) and grades: EC = Extra Coarse, C = Coarse, M = Medium, F = Fine, VF = Very Fine, EF = Extra Fine, SF = Super Fine, UF = Ultra Fine. Safety information advises wearing approved safety goggles when sanding, using a particle/dust mask or other means to prevent inhalation of sanding dust, and following the manufacturer’s recommended procedures and safety instructions when using power tools.]

Reprinted by permission of and copyrighted by the 3M Co.

At the same level of difficulty (307), another task directs the reader to a
stacked bar graph depicting estimated power consumption by source for four
different years. The reader is asked to select an energy source that will provide
more power in the year 2000 than it did in 1971. To succeed on this task, the
reader must first identify the correct years and then compare each of the five
pairs of energy sources given.

Document Level 4

Scale range: 326 to 375

Tasks in this level, like those in the previous levels, ask readers to
perform multiple-feature matches, cycle through documents, and
integrate information; however, they require a greater degree of
inferencing. Many of these tasks require readers to provide numerous
responses but do not designate how many responses are needed.
Conditional information is also present in the document tasks in this
level and must be taken into account by the reader.
Average difficulty value of tasks in this level: 340
Percentage of adults performing in this level: 15%


One task in this level (348) combines many of the variables that contribute
to difficulty in Level 4. These include: multiple feature matching, complex
displays involving nested information, numerous distractors, and conditional
information that must be taken into account in order to arrive at a correct
response. Using the bus schedule shown here, readers are asked to select the
time of the next bus on a Saturday afternoon, if they miss the 2:35 bus leaving
Hancock and Buena Ventura going to Flintridge and Academy. Several
departure times are given, from which respondents must choose the correct one.
On Saturday afternoon, if you miss the 2:35 bus
leaving Hancock and Buena Ventura going to
Flintridge and Academy, how long will you have to
wait for the next bus?

ROUTE 5: VISTA GRANDE
This bus line operates Monday through Saturday providing “local service”
to most neighborhoods in the northeast section.
Buses run thirty minutes apart during the morning and afternoon rush hours Monday through Friday.
Buses run one hour apart at all other times of day and Saturday.
No Sunday, holiday or night service.

[Facsimile of the Route 5 timetable. OUTBOUND (from Terminal) and INBOUND (toward Terminal) columns list departure and arrival times at the Downtown Terminal, Hancock and Buena Ventura, Citadel, Rustic Hills, North Carefree and Oro Blanco, and Flintridge and Academy; a number of trips are marked “Monday through Friday only.” Notes state that riders can transfer from this bus to another headed anywhere else in the city bus system, and that to be sure of a smooth transfer they should tell the driver of this bus the name of the second bus they need.]


Other tasks involving this bus schedule are found in Level 3. These tasks
require the reader to match on fewer features of information and do not
involve the use of conditional information.
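The conditional step in this task, recognizing that on Saturday the route runs hourly so the bus after the missed 2:35 departure leaves at 3:35, reduces to a simple clock-time subtraction. A minimal sketch (the function name is illustrative, not part of the assessment):

```python
from datetime import datetime

def wait_minutes(missed: str, next_bus: str) -> int:
    """Minutes between a missed departure and the next scheduled one."""
    fmt = "%I:%M"  # 12-hour clock times, as printed on the schedule
    return int((datetime.strptime(next_bus, fmt)
                - datetime.strptime(missed, fmt)).total_seconds() // 60)

# Saturday service runs one hour apart, so the next bus leaves at 3:35.
print(wait_minutes("2:35", "3:35"))  # 60
```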

Document Level 5

Scale range: 376 to 500

Tasks in this level require the reader to search through complex
displays that contain multiple distractors, to make high-level
text-based inferences, and to use specialized knowledge.
Average difficulty value of tasks in this level: 391
Percentage of adults performing in this level: 3%

A task receiving a difficulty value of 396 involves reading and
understanding a table depicting the results from a survey of parents and
teachers evaluating parental involvement in their school. Respondents were
asked to write a brief paragraph summarizing the results. This particular task
requires readers to integrate the information in the table to compare and
contrast the viewpoints of parents and teachers on a selected number of
school issues.
Using the information in the table, write a brief
paragraph summarizing the extent to which parents
and teachers agreed or disagreed on the statements
about issues pertaining to parental involvement at
their school.

Parents and Teachers Evaluate Parental Involvement at Their School
Do you agree or disagree that . . . ?
(percent agreeing)

                                     Level of School
                         Total  Elementary  Junior High  High School
Our school does a good job of encouraging parental involvement in
sports, arts, and other nonsubject areas
  Parents                  77       76          74           79
  Teachers                 77       73          77           85
Our school does a good job of encouraging parental involvement in
educational areas
  Parents                  73       82          71           64
  Teachers                 80       84          78           70
Our school only contacts parents when there is a problem with their child
  Parents                  55       46          62           63
  Teachers                 23       18          22           33
Our school does not give parents the opportunity for any meaningful roles
  Parents                  22       18          22           28
  Teachers                  8        8          12            7

Source: The Metropolitan Life Survey of the American Teacher, 1987

Quantitative Literacy
Since adults are often required to perform numerical operations in everyday
life, the ability to perform quantitative tasks is another important aspect of
literacy. These abilities may seem, at first glance, to be fundamentally different
from the types of skills involved in reading prose and documents and,
therefore, to extend the concept of literacy beyond its traditional limits.
However, research indicates that the processing of printed information plays a
critical role in affecting the difficulty of tasks along this scale.3

3 I.S. Kirsch and A. Jungeblut. (1986). Literacy: Profiles of America’s Young Adults, Final Report.
Princeton, NJ: Educational Testing Service. I.S. Kirsch, A. Jungeblut, and A. Campbell. (1992). Beyond the
School Doors: The Literacy Needs of Job Seekers Served by the U.S. Department of Labor.
Princeton, NJ: Educational Testing Service.


The NALS quantitative literacy scale contains some 43 tasks with difficulty
values that range from 191 to 436. The difficulty of these tasks appears to be a
function of several factors, including:
• the particular arithmetic operation called for
• the number of operations needed to perform the task
• the extent to which the numbers are embedded in printed materials, and
• the extent to which an inference must be made to identify the type of
  operation to be performed

In general, it appears that many individuals can perform simple arithmetic
operations when both the numbers and operations are made explicit. However,
when the numbers to be used must be located in and extracted from different
types of documents that contain similar but irrelevant information, or when the
operations to be used must be inferred from printed directions, the tasks
become increasingly difficult.
A detailed discussion of the five levels of quantitative literacy is provided
on the following pages.

Quantitative Level 1

Scale range: 0 to 225

Tasks in this level require readers to perform single, relatively simple
arithmetic operations, such as addition. The numbers to be used are
provided and the arithmetic operation to be performed is specified.
Average difficulty value of tasks in this level: 206
Percentage of adults performing in this level: 22%

The least demanding task on the quantitative scale (191) requires the
reader to total two numbers on a bank deposit slip. In this task, both the
numbers and the arithmetic operation are judged to be easily identified and the
operation involves the simple addition of two decimal numbers that are set up
in column format.


You wish to use the automatic teller machine at your
bank to make a deposit. Figure the total amount of
the two checks being deposited. Enter the amount
on the form in the space next to TOTAL.

[Facsimile of an automatic teller deposit form for account holder Chris Jones. Two Union Bank checks, for $557.19 and $75.00, are listed above a blank TOTAL line. The form carries notices on the availability and verification of deposits and instructions such as “PLEASE PRINT,” “DO NOT FOLD,” “DO NOT DETACH TICKET,” and “NO COINS OR PAPER CLIPS PLEASE.”]
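The arithmetic behind this least-difficult task is the column addition of the two check amounts; a minimal sketch, using the amounts shown on the facsimile slip:

```python
from decimal import Decimal

# The two checks listed on the deposit slip.
checks = [Decimal("557.19"), Decimal("75.00")]

# The task asks only for their sum, entered next to TOTAL.
total = sum(checks)
print(total)  # 632.19
```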

Quantitative Level 2

Scale range: 226 to 275

Tasks in this level typically require readers to perform a single
operation using numbers that are either stated in the task or easily
located in the material. The operation to be performed may be stated
in the question or easily determined from the format of the material
(for example, an order form).
Average difficulty value of tasks in this level: 251
Percentage of adults performing in this level: 25%

In the easier tasks in Level 2, the quantities are also easy to locate. In one
such task at 246 on the quantitative scale, the cost of a ticket and bus is given
for each of two shows. The reader is directed to determine how much less
attending one show will cost in comparison to the other.


The price of one ticket and bus for “Sleuth” costs
how much less than the price of one ticket and bus
for “On the Town”?

THEATER TRIP
A charter bus will leave from the bus stop (near the Conference Center)
at 4 p.m., giving you plenty of time for dinner in New York. Return trip
will start from West 45th Street directly following the plays. Both theaters
are on West 45th Street. Allow about 1 1/2 hours for the return trip.
Time: 4 p.m., Saturday, November 20
Price: “On the Town” Ticket and bus $11.00
       “Sleuth” Ticket and bus $8.50
Limit: Two tickets per person
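The single operation this task calls for is a subtraction of the two prices given in the announcement; as a sketch:

```python
from decimal import Decimal

on_the_town = Decimal("11.00")  # ticket and bus
sleuth = Decimal("8.50")        # ticket and bus

# "Sleuth" costs this much less than "On the Town".
difference = on_the_town - sleuth
print(difference)  # 2.50
```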

In a more complex set of tasks, the reader is directed to complete an order
form for office supplies using a page from a catalogue. No other specific
instructions as to what parts of the form should be completed are given in the
directive. One task (difficulty value of 270) requires the reader to use a table on
the form to locate the appropriate shipping charges based on the amount of a
specified set of office supplies, to enter the correct amount on an order form,
and then to calculate the total price of the supplies.
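The two steps this task requires, looking up a shipping charge from a bracketed table and adding it to the merchandise subtotal, can be sketched as follows. The bracket boundaries and charges below are hypothetical, since the actual catalogue page is not reproduced here:

```python
from decimal import Decimal

# Hypothetical shipping-charge brackets keyed by order subtotal;
# illustrative figures only, not the assessment's catalogue page.
SHIPPING = [
    (Decimal("10.00"), Decimal("2.00")),  # orders up to $10.00
    (Decimal("25.00"), Decimal("3.50")),  # orders up to $25.00
    (Decimal("50.00"), Decimal("5.00")),  # orders up to $50.00
]

def shipping_charge(subtotal: Decimal) -> Decimal:
    """Return the charge for the first bracket the subtotal fits."""
    for limit, charge in SHIPPING:
        if subtotal <= limit:
            return charge
    return Decimal("7.50")  # flat charge for larger orders

subtotal = Decimal("22.95")  # hypothetical supplies subtotal
total = subtotal + shipping_charge(subtotal)
print(total)  # 26.45
```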

Quantitative Level 3

Scale range: 276 to 325

In tasks in this level, two or more numbers are typically needed to
solve the problem, and these must be found in the material. The
operation(s) needed can be determined from the arithmetic relation
terms used in the question or directive.
Average difficulty value of tasks in this level: 293
Percentage of adults performing in this level: 31%


In general, tasks within the range for Level 3 ask the reader to perform a
single operation of addition, subtraction, multiplication, or division. However,
the operation is not stated explicitly in the directive or made clear by the
format of the document. Instead, it must be inferred from the terms used in
the directive. These tasks are also more difficult because the reader must locate
the numbers in various parts of the document in order to perform the
operation.
From a bar graph showing percentages of population growth for two
groups across six periods, a task at the 279 point on the scale directs the reader
to calculate the difference between the groups for one of the years.
A more difficult task in Level 3 (321) requires the use of a bus schedule to
determine how long it takes to travel from one location to another on a
Saturday. To respond correctly, the reader must match on several features of
information given in the question to locate the appropriate times.
1234567890123456789012
1234567890123456789012
1234567890123456789012
1234567890123456789012

Suppose that you took the 12:45 p.m. bus from
U.A.L.R. Student Union to 17th and Main on a
Saturday. According to the schedule, how many
minutes is the bus ride?


Quantitative Level 4

Scale range: 326 to 375

These tasks tend to require readers to perform two or more sequential
operations or a single operation in which the quantities are found in
different types of displays, or the operations must be inferred from
semantic information given or drawn from prior knowledge.
Average difficulty value of tasks in this level: 349
Percentage of adults performing in this level: 17%

One task in this level, with a difficulty value of 332, asks the reader to
estimate, based on information in a news article, how many miles per day a
driver covered in a sled-dog race. The respondent must know that to calculate
a “per day” rate requires the use of division.
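The inference required, that a “per day” figure calls for dividing total distance by number of days, amounts to one line; the race figures below are hypothetical, since the news article is not reproduced here:

```python
# Hypothetical race figures (the article's actual numbers are not shown here).
total_miles = 1100
days = 11

# A "per day" rate calls for division.
miles_per_day = total_miles / days
print(miles_per_day)  # 100.0
```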
A more difficult task (355) requires the reader to select from two unit
price labels to estimate the cost per ounce of creamy peanut butter. To perform
this task successfully, readers may have to draw some information from prior
knowledge.
Estimate the cost per ounce of the creamy peanut
butter. Write your estimate on the line provided.

[Facsimile of two shelf price labels:

  rich chnky pnt bt, 16 oz.    Unit price: 11.8¢ per oz.    You pay: 1.89
  creamy pnt butter, 20 oz.    Unit price: 1.59 per lb.     You pay: 1.99]
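Because the creamy label quotes its unit price per pound while the question asks for a cost per ounce, the reader must divide the shelf price by the net weight; a sketch using the figures on the label:

```python
from decimal import Decimal

price = Decimal("1.99")   # "You pay" on the creamy peanut butter label
ounces = Decimal("20")    # net weight printed on the label

per_ounce = price / ounces
print(per_ounce)  # 0.0995 dollars, i.e., about 10 cents per ounce
```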


Quantitative Level 5

Scale range: 376 to 500

These tasks require readers to perform multiple operations
sequentially. They must disembed the features of the problem from
text or rely on background knowledge to determine the quantities or
operations needed.
Average difficulty value of tasks in this level: 411
Percentage of adults performing in this level: 4%

One of the most difficult tasks on the quantitative scale (433) requires
readers to look at an advertisement for a home equity loan and then, using the
information given, explain how they would calculate the total amount of
interest charges associated with the loan.
You need to borrow $10,000. Find the ad for Home
Equity Loans on page 2 in the newspaper provided.
Explain to the interviewer how you would compute
the total amount of interest charges you would pay
under this loan plan. Please tell the interviewer
when you are ready to begin.

14.25%
SAMPLE MONTHLY REPAYMENT SCHEDULE
Amount Financed     Monthly Payment
$10,000             $156.77
$25,000             $391.93
$40,000             $627.09
120 Months, 14.25% APR

Reduced from original copy.
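A correct explanation multiplies the monthly payment by the number of months and subtracts the amount financed; a sketch using the $10,000 row of the sample repayment schedule:

```python
from decimal import Decimal

amount_financed = Decimal("10000.00")
monthly_payment = Decimal("156.77")   # from the sample repayment schedule
months = 120

# Total interest = everything paid back minus the amount borrowed.
total_interest = monthly_payment * months - amount_financed
print(total_interest)  # 8812.40
```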

Successful Task Performance across the Literacy Levels
The main purpose of the literacy scales is to summarize how well adults can
perform on the full array of tasks in the assessment. The difficulty of the
assessment tasks increases proportionally with the progression of
information-processing demands across the scales. The literacy levels provide a way not only
to explore this progression, but also to explore the likelihood that individuals in
each level will succeed on tasks of varying difficulty.
The following graphs (Figure A.2) display the probability that individuals
performing at selected points on each scale will give a correct response to tasks
with varying difficulty values. For example, a person whose prose proficiency is
150 has less than a 50 percent chance of giving a correct response to an average
prose task in Level 1, where the average task difficulty is 198. Individuals
whose scores were at the 200 point, on the other hand, have an almost 80
percent probability of responding correctly to these tasks.
In terms of task demands, adults performing at the 200 point on the prose
scale are likely to be able to locate a single piece of information in a brief piece
of text where there is no distracting information, or when any distracting
information is located apart from the desired information. They are likely to
have far more difficulty with the types of tasks that occur in Levels 2 through 5,
however. For example, they would have only about a 30 percent chance of
performing the average task in Level 2 correctly, where the average task
difficulty value is 259, and only about a 10 percent chance of success, or less,
on the more challenging tasks found in Levels 3, 4, and 5.
In contrast, readers at the 300 point on the prose scale have more than an
80 percent probability of success on tasks in Levels 1 and 2, and have close to
an 80 percent likelihood of success on tasks in Level 3, where the average task
difficulty value is 298. This means that they demonstrate consistent success
identifying information in fairly dense text without organizational aids. They
can also consistently integrate, compare, and contrast information that is easily
identified in the text. On the other hand, they are likely not to have mastered
tasks that require them to make higher level inferences, to take conditional
information into account, and to use specialized knowledge. The probabilities
of their successfully performing these Level 4 tasks, where the average task
difficulty value is 352, are just under 50 percent, and on the Level 5 tasks their
likelihood of responding correctly falls to less than 20 percent.
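These probabilities come from the survey's item response theory scaling, whose exact parameters are not given in this appendix. The qualitative pattern, with success anchored at 80 percent when proficiency equals a task's difficulty value, can be illustrated with a logistic curve; the slope below is an assumed value, not the survey's calibration:

```python
import math

RP = 0.80  # response-probability convention anchoring task difficulties

def p_correct(proficiency: float, difficulty: float, slope: float = 0.02) -> float:
    """Illustrative logistic response curve: equals RP when proficiency
    matches the task's difficulty value (slope is assumed, not estimated)."""
    offset = math.log(RP / (1 - RP))
    return 1.0 / (1.0 + math.exp(-(slope * (proficiency - difficulty) + offset)))

# At the anchor point the 80 percent criterion is met exactly...
print(round(p_correct(300, 300), 2))  # 0.8
# ...and the probability declines as task difficulty rises past proficiency.
print(p_correct(300, 352) < p_correct(300, 298))  # True
```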
Similar interpretations can be made using the performance results on the
document and quantitative scales. For example, an individual with a
proficiency of 150 on the document scale is estimated to have less than a 50
percent chance of responding correctly to tasks in Level 1, where the average
task difficulty value is 195, and less than a 30 percent chance of responding


Figure A.2
Average Probabilities of Successful Performance by Individuals with Selected Proficiency
Scores on the Tasks in Each Literacy Level

[Three graphs, one each for the PROSE, DOCUMENT, and QUANTITATIVE scales. Each graph plots the average probability of success (0.0 to 1.0) on Level 1 through Level 5 tasks for adults with proficiency scores of 150, 200, 250, 300, 350, and 400.]

Source: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey, 1992.

correctly to tasks in each of the higher levels. On the quantitative literacy scale,
adults with a proficiency of 150 are estimated to have only a 50 percent chance
of responding correctly to an average quantitative task in Level 1, where the
average task difficulty is 206, and less than a 30 percent chance of responding
correctly to tasks in the other levels. Such individuals demonstrate little or no
proficiency in performing the range of quantitative tasks found in this
assessment. In contrast, adults with a quantitative score of 300 exceed the 80
percent criterion for the average tasks in Levels 1 and 2 and meet the 80
percent criterion for many of the tasks in Level 3. They can be expected to
encounter more difficulty with quantitative tasks in Levels 4 and 5.

Missing Responses to Literacy Tasks
In any educational, social, or political opinion survey, missing responses are
always present. Sometimes missing data can be ignored when tabulating and
reporting survey results. If the reasons the data are missing are related to the
outcome of the study, however, the missing responses will bias the results
unless some adjustment can be made to counter the bias. In this survey, there
were reasons to believe that the literacy performance data were missing more
often for adults with lower levels of literacy than for adults with higher levels.
Field test evidence and experience with surveys indicated that adults with
lower levels of literacy would be more likely than adults with higher
proficiencies either to decline to respond to the survey at all or to begin the
assessment but not to complete it. Ignoring the pattern of missing data would
have resulted in overestimating the literacy skills of adults in the United States.
For this survey, several procedures were developed to reduce biases due
to nonresponse, based on how much of the survey the respondent completed.3
Individuals who refused to participate in the survey before any information
about them was collected were omitted from the analyses. Because they were
unlikely to know that the survey intended to assess their literacy, it was
assumed that their reason for refusing was not related to their level of literacy
skills.
Some individuals began the interview, but stopped before they completed
at least five tasks on each literacy scale.4 The interviewers were trained to
record accurately their reasons for stopping. The reasons were subsequently
3 For a full discussion of the procedures used in scoring, scaling, weighting, and handling nonresponse
problems, see the forthcoming Technical Report of the 1992 National Adult Literacy Survey.

4 Five was the minimum number of completed tasks needed for accurate proficiency estimation. No special
procedures were needed to estimate the proficiencies of those who broke off the assessment after
attempting five or more tasks on each scale.


classified as either related or unrelated to literacy skills. Literacy-related
reasons included difficulty with reading or writing, inability to read or write in
English, and mental or learning disabilities. Reasons unrelated to literacy
included physical disabilities, time conflicts, and interruptions. Some adults
gave no reason for stopping the assessment.
Overall, 88 percent of respondents completed the assessment (at least five
tasks on each literacy scale). Twelve percent started the survey but stopped
before completing five tasks. About half of these individuals, or 6 percent of
the adult population, did not complete the assessment for reasons related to
their literacy skills, while the other 6 percent did not complete it for reasons
unrelated to literacy or for no stated reason.
The missing data were treated differently depending on whether
nonrespondents’ reasons were related or unrelated to their literacy skills. The
missing responses of those who gave literacy-related reasons for terminating
the assessment were treated as wrong answers, based on the assumption that
they could not have correctly completed the literacy tasks. The missing
responses of those who broke off the assessment for no stated reason or for
reasons unrelated to literacy were essentially ignored, since it could not be
assumed that their answers would have been either correct or incorrect. The
proficiencies of such respondents were inferred from the performance of other
adults with similar characteristics.
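The two-way rule described above can be sketched as a small classifier; the reason strings below are illustrative stand-ins for the interviewers' recorded categories, and the survey's actual imputation from background characteristics is not shown:

```python
# Reasons classified as related to literacy skills
# (illustrative labels; the survey's exact category names may differ).
LITERACY_RELATED = {
    "difficulty with reading or writing",
    "cannot read or write in English",
    "mental or learning disability",
}

def score_missing(reason: str):
    """Score a missing response: wrong (0) if the break-off reason was
    literacy-related, otherwise None (ignored; proficiency inferred
    from adults with similar characteristics)."""
    return 0 if reason in LITERACY_RELATED else None

print(score_missing("difficulty with reading or writing"))  # 0
print(score_missing("time conflict"))                       # None
```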
Table A.1 shows the proficiency scores resulting from these procedures.
Adults who completed the assessment had average proficiencies ranging from
279 to 285 on the three literacy scales. Because the missing responses of adults
who did not complete the assessment for reasons related to literacy were
treated as wrong answers, the average scores of these adults were considerably
lower, ranging from 114 to 124. Nearly all adults who terminated the
assessment for literacy-related reasons scored in the Level 1 range (below 225).
Adults who stopped for other reasons or for unstated reasons had scores
between those of the other two groups, ranging from 228 to 237. These adults
were not found only in the lowest literacy level, but were distributed across the
five levels.
It is likely that there were some errors in classifying nonrespondents’
reasons for not completing the assessment. Some adults may have given an
explanation that reflected badly on their literacy skills simply because they
found completing the assessment too burdensome. Perhaps they could have
performed better had they tried harder. The assumption that such adults
are unable to succeed with the literacy tasks may be too strong, and the
assignment of wrong answers may underestimate their skills. Other adults may
have anticipated failure in the assessment, yet concealed their lack of literacy


Table A.1: Percentages and average proficiencies of adults on each
scale, by assessment completion status

                                                   Literacy scale
                                            Prose      Document   Quantitative
Assessment completion status        CPCT  PROF (se)   PROF (se)   PROF (se)
Total                                100  272 (0.6)   267 (0.7)   271 (0.7)
Completed assessment                  88  285 (0.6)   279 (0.6)   284 (0.6)
Did not complete assessment
  for literacy-related reasons         6  124 (1.5)   116 (1.4)   114 (1.9)
Did not complete assessment
  for reasons unrelated to literacy    6  237 (3.0)   228 (2.8)   231 (3.6)

Notes: CPCT = column percentage; PROF = average proficiency; se = standard error.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National
Adult Literacy Survey, 1992.

skills by citing other reasons for not responding, or by refusing to explain their
reason. The assumption that these adults are just like others in their
demographic group may also be too strong, and the failure to assign wrong
answers may overestimate their skills. To some extent the errors can be
expected to counterbalance one another, but the available data are insufficient
to assess which kind of classification error occurred more often.

Performance in the Lowest Literacy Level
Level 1 is somewhat different from the other literacy levels. For Levels 2
through 5, adults who can consistently perform the tasks in a given level (that
is, at least 80 percent of the time) are said to perform in that level. For
example, adults in Level 2 have a high probability of success on the tasks in that
level, and more than an 80 percent likelihood of success on the Level 1 tasks.
Likewise, adults in Level 3 have a high probability of success on the tasks in
that level, as well as on the tasks in Levels 1 and 2.
Level 1, on the other hand, includes adults with a wide range of literacy
skills, including some who performed the Level 1 tasks consistently and others
who did not. Individuals who do not have an 80 percent probability of success
with Level 1 tasks are still grouped in Level 1. Thus, some but not all adults in
this level met the relatively undemanding requirements of the Level 1 tasks.
This section describes how many adults in Level 1 did not meet the demands of
the tasks in this level.
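The 80 percent criterion can be illustrated with a small sketch. The logistic response curve, its slope, and the task difficulty values below are illustrative assumptions, not the survey's actual scaling model:

```python
# A minimal sketch of the 80 percent response-probability rule, assuming a
# logistic item response curve. The slope and the task difficulties are
# invented for illustration.
import math

def p_success(proficiency, task_difficulty, slope=0.05):
    """Probability that an adult at `proficiency` succeeds on a task of
    difficulty `task_difficulty` (illustrative logistic curve)."""
    return 1.0 / (1.0 + math.exp(-slope * (proficiency - task_difficulty)))

def performs_in_level(proficiency, level_task_difficulties):
    """An adult performs in a level if the probability of success is at
    least 80 percent on every task in that level."""
    return all(p_success(proficiency, d) >= 0.80
               for d in level_task_difficulties)

# Hypothetical Level 2 task difficulties (Level 1 ends at 225):
level2_tasks = [230, 250, 270]
print(performs_in_level(300, level2_tasks))  # consistent success: True
print(performs_in_level(250, level2_tasks))  # not consistent: False
```

Under this rule, everyone whose success probability falls short on the easiest level's tasks is still grouped in Level 1, which is why Level 1 spans such a wide range of skill.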

Appendix A . . . . . . 123

The failure to perform correctly at least one of the literacy tasks can be
taken as an indicator of not being able to meet the demands of tasks in Level 1.
Table A.2 provides information on the size of the groups that met or did not
meet the relatively undemanding requirements of the Level 1 tasks.
Most adults in the lowest literacy level on each scale performed at least
one literacy task correctly. Nearly three-quarters (72 percent) of adults in Level
1 on the prose scale performed at least one task correctly, as did 83 percent of
those in Level 1 on the document scale and 66 percent of those in Level 1 on
the quantitative scale. The difference in performance among the scales occurs
because the least difficult document task had a value of 68, while the least
difficult prose task had a value of 149 and the least difficult quantitative task
had a value of 191.

Table A.2: Percentages and average proficiencies on each scale of
adults in Level 1

                                        Literacy scale
                                Prose        Document     Quantitative
Performance                     CPCT  PROF   CPCT  PROF   CPCT  PROF

Total in Level 1                 100   173    100   172    100   167
At least one task correct         72   190     83   182     66   190
No tasks correct                  21   113     11    94     26   110
No performance data                7   177      6   177      8   159

Notes: CPCT = column percentage; PROF = average proficiency.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National
Adult Literacy Survey, 1992.

A small proportion of adults in Level 1 did not perform any literacy tasks
correctly. Some of these adults completed the survey, while others did not for
literacy-related or other reasons. Those who did not succeed on any literacy
tasks constitute 21 percent of adults in Level 1 on the prose scale, 11 percent of
adults in Level 1 on the document scale, and 26 percent of adults in Level 1 on
the quantitative scale. There are wide disparities in average proficiencies
between those who performed at least one task correctly (182 to 190 across the
scales) and those who did not (94 to 113 across the scales).
For some adults in Level 1 (6 to 8 percent) there are no literacy
performance data because they did not respond to any of the literacy tasks for
reasons unrelated to their literacy skills or for unknown reasons. These persons
could not be described as either meeting or failing to meet the demands of the
literacy tasks, so they are distinguished as a separate group. Their proficiencies

124 . . . . . . Appendix A

were inferred from the performance of other adults with similar demographic
backgrounds and fell in the middle range between the other two groups.
Nearly all adults who correctly responded to at least one literacy task also
completed the assessment. Still, some adults broke off the assessment after
already having shown some initial success. Table A.3 divides adults in Level 1
who were successful with at least one task into two groups: those who
completed the assessment (at least five literacy tasks) and those who did not.
Across the scales, from 83 to 90 percent of those in Level 1 who correctly
responded to at least one task also completed the assessment. Their average
scores ranged from 192 to 196. The remainder (10 to 17 percent) performed at
least one task correctly before breaking off the assessment. Their average
scores were much lower, ranging from 132 to 153.

Table A.3: Percentages and average proficiencies of adults in Level 1
with at least one task correct, by assessment completion status

                                        Literacy scale
                                Prose        Document     Quantitative
Completion status               CPCT  PROF   CPCT  PROF   CPCT  PROF

Total in Level 1 with
  at least one task correct      100   190    100   182    100   190
Completed assessment              87   196     83   192     90   194
Did not complete assessment       13   153     17   132     10   153

Notes: CPCT = column percentage; PROF = average proficiency.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National
Adult Literacy Survey, 1992.

The population of adults who scored in Level 1 on each scale includes not
only those who demonstrated success with at least some of the tasks in Level 1
— who constituted the majority — but also those who did not succeed with any
of the tasks in this level. Nearly all of those in Level 1 who did not perform any
literacy tasks correctly also failed to complete the assessment (86 to 98
percent), as shown in table A.4. Their average scores range from 93 to 107
across the scales. Most of these adults either did not start or broke off the
assessment for literacy-related reasons, so that any literacy tasks that remained
unanswered were treated as incorrect.

Appendix A . . . . . . 125

Table A.4: Percentages and average proficiencies of adults in Level 1
with no tasks correct, by assessment completion status

                                        Literacy scale
                                Prose        Document     Quantitative
Completion status               CPCT  PROF   CPCT  PROF   CPCT  PROF

Total in Level 1 with
  no tasks correct               100   113    100    94    100   110
Completed assessment              14   148      2   ----    14   146
Did not complete assessment       86   107     98    93     86    98

Notes: CPCT = column percentage; PROF = average proficiency.
---- indicates that the cell size is too small to provide reliable proficiency estimates.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National
Adult Literacy Survey, 1992.

Two to 14 percent of the adults in Level 1 who did not succeed on any of
the literacy tasks did, in fact, complete the assessment. Their average scores
were 148 on the prose scale and 146 on the quantitative scale; too few cases
were available to estimate an average document score.
The pattern of Level 1 proficiencies associated with various combinations
of missing and incorrect answers shows the consequences of including, rather
than excluding, adults who did not complete the assessment for literacy-related
reasons. In general, the very low scores of these adults bring down the average
for any group in which they are a significant component. Omitting these
persons from the assessment would have resulted in inflated estimates of the
literacy skills of the adult population overall and particularly of certain
subgroups.

Population Diversity within the Lowest Literacy Level
Certain populations of adults were disproportionately likely not to meet the
demands of the Level 1 tasks. This section describes the characteristics of
adults in Level 1 who did not meet the relatively undemanding requirements of
the tasks in this level. Tables A.5P, D, and Q provide information on the
demographic composition of the total adult population in this country, of adults
in Level 1 on each literacy scale, and of those adults in Level 1 who did not
succeed on any of the assessment tasks.

126 . . . . . . Appendix A

Table A.5P: Percentages of adults in selected groups, by membership
in total U.S. population, in Level 1, and in Level 1 with no tasks correct
Prose scale

                                   Total U.S.    Level 1       Level 1,
                                   population    population    no tasks correct
Population group                   CPCT          CPCT (se)     CPCT (se)

Weighted sample size
  (in millions)                    191.3         40.0          8.2
Country of birth
  Born in another country          10            25 (1.3)      55 (2.2)
Highest level of education
  0 to 8 years                     10            35 (1.6)      61 (2.3)
  9 to 12 years                    13            27 (1.3)      17 (1.5)
  HS diploma or GED                30            24 (1.4)      14 (1.5)
Race/Ethnicity
  White                            76            51 (0.6)      29 (2.3)
  Black                            11            20 (1.0)      15 (1.4)
  Hispanic                         10            23 (1.4)      49 (2.1)
  Asian/Pacific Islander            2             4 (3.9)       5 (0.9)
Age
  16 to 24 years                   18            13 (0.8)      10 (1.2)
  65 years and older               16            33 (1.5)      28 (1.8)
Disability or condition
  Any condition                    12            26 (1.0)      26 (1.7)
  Visual difficulty                 7            19 (1.5)      20 (1.5)
  Hearing difficulty                7            13 (1.6)      13 (2.0)
  Learning disability               3             9 (2.1)      15 (1.4)

Notes: CPCT = column percentage; se = standard error.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National
Adult Literacy Survey, 1992.

Appendix A . . . . . . 127

Table A.5D: Percentages of adults in selected groups, by membership
in total U.S. population, in Level 1, and in Level 1 with no tasks correct
Document scale

                                   Total U.S.    Level 1       Level 1,
                                   population    population    no tasks correct
Population group                   CPCT          CPCT (se)     CPCT (se)

Weighted sample size
  (in millions)                    191.3         44.0          4.7
Country of birth
  Born in another country          10            22 (1.3)      67 (3.2)
Highest level of education
  0 to 8 years                     10            33 (1.5)      65 (3.1)
  9 to 12 years                    13            26 (1.5)      12 (1.7)
  HS diploma or GED                30            26 (1.7)      13 (2.1)
Race/Ethnicity
  White                            76            54 (0.7)      21 (3.0)
  Black                            11            20 (0.9)       9 (1.1)
  Hispanic                         10            21 (1.7)      62 (3.2)
  Asian/Pacific Islander            2             3 (3.2)       5 (1.6)
Age
  16 to 24 years                   18            11 (0.6)      11 (1.8)
  65 years and older               16            35 (1.5)      25 (2.2)
Disability or condition
  Any condition                    12            26 (1.2)      22 (2.5)
  Visual difficulty                 7            18 (1.3)      17 (2.3)
  Hearing difficulty                7            13 (2.0)      12 (2.0)
  Learning disability               3             8 (2.3)      14 (1.6)

Notes: CPCT = column percentage; se = standard error.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National
Adult Literacy Survey, 1992.

While 10 percent of the adult population reported that they were born in
another country, from 22 to 25 percent of the individuals who performed in
Level 1 on the three scales and 54 to 67 percent of those in Level 1 who did
not perform any tasks correctly were foreign born. Some of these individuals
were undoubtedly recent immigrants with a limited command of English.

128 . . . . . . Appendix A

Table A.5Q: Percentages of adults in selected groups, by membership
in total U.S. population, in Level 1, and in Level 1 with no tasks correct
Quantitative scale

                                   Total U.S.    Level 1       Level 1,
                                   population    population    no tasks correct
Population group                   CPCT          CPCT (se)     CPCT (se)

Weighted sample size
  (in millions)                    191.3         42.0          10.6
Country of birth
  Born in another country          10            22 (1.2)      54 (2.0)
Highest level of education
  0 to 8 years                     10            33 (1.6)      58 (2.5)
  9 to 12 years                    13            27 (1.5)      20 (1.5)
  HS diploma or GED                30            25 (1.6)      13 (1.3)
Race/Ethnicity
  White                            76            50 (0.5)      34 (2.2)
  Black                            11            23 (0.9)      19 (1.2)
  Hispanic                         10            22 (1.3)      40 (1.9)
  Asian/Pacific Islander            2             3 (3.6)       5 (0.9)
Age
  16 to 24 years                   18            14 (0.8)      10 (0.9)
  65 years and older               16            32 (1.5)      32 (1.7)
Disability or condition
  Any condition                    12            26 (1.2)      28 (1.4)
  Visual difficulty                 7            19 (1.4)      21 (1.4)
  Hearing difficulty                7            12 (2.1)      13 (1.5)
  Learning disability               3             8 (2.7)      15 (1.0)
Notes: CPCT = column percentage; se = standard error.

SOURCE: U.S. Department of Education, National Center for Education Statistics, National
Adult Literacy Survey, 1992.

Adults who did not complete high school were also disproportionately
represented at the low end of the literacy scales. While 23 percent of the adult
population reported that they had not completed high school, 59 to 62 percent
of adults who performed in Level 1 on the three scales and 77 to 78 percent of
those in Level 1 with no tasks correct said they had not completed high school
or its equivalent.

Appendix A . . . . . . 129

Relatively high percentages of the respondents in Level 1 were Black,
Hispanic, or Asian/Pacific Islander. The largest group among those who did not
perform any tasks correctly was Hispanic. Hispanics and Asian/Pacific
Islanders are more likely than others to be recent immigrants with a limited
command of English.
Older adults were overrepresented in the Level 1 population as well as in
the population of adults who did not meet the demands of the Level 1 tasks.
While 16 percent of the total U.S. population was age 65 or older,
approximately one-third of the Level 1 population and 25 to 32 percent of the
adults in Level 1 who performed no literacy tasks correctly were in this age
group. In contrast, compared with their representation in the total U.S.
population (18 percent), younger adults were underrepresented in Level 1 (11
to 14 percent) and in the subgroup of Level 1 that did not succeed on any of
the literacy tasks (10 to 11 percent).
Disabilities are sometimes associated with low literacy performance.
While 12 percent of the adult population reported having a physical, mental, or
health condition that kept them from participating fully in work and other
activities, 26 percent of adults who performed in Level 1 and 22 to 28 percent
of those in Level 1 who did not succeed on any of the literacy tasks had such
conditions. Further, while only 3 percent of the U.S. population reported
having a learning disability, 8 to 9 percent of the adults who performed in Level
1 on the prose, document, and quantitative scales and 14 to 15 percent of those
in Level 1 who did not succeed on any task had this type of disability.
These results show that adults in some population groups were
disproportionately likely to perform in the lowest literacy level, and among
those who performed in this level, were disproportionately likely not to succeed
on any of the literacy tasks in the assessment.

130 . . . . . . Appendix A

APPENDIX B
Tables

NALS

Table B.1
Average Years of Schooling of Inmates, by Number of Times Recidivated

PRIOR SENTENCE                  WGT N       n      MEAN    STANDARD
                                (/1,000)                   ERROR

Probation
  None                            448      284     11.1      0.1
  1 time                          327      222     10.6      0.1
  2 times                         192      128     10.9      0.2
  3 or more times                 161      107     10.8      0.2
Incarceration
  None                            426      265     11.1      0.1
  1 time                          231      155     10.7      0.2
  2 times                         148      102     11.0      0.1
  3 or more times                 318      215     10.7      0.1
Probation and/or Incarceration
  None                            267      162     11.3      0.2
  1 time                          182      123     10.7      0.2
  2 times                         173      115     10.7      0.2
  3 or more times                 504      339     10.8      0.1

Source: Educational Testing Service, National Adult Literacy Survey, 1992.

NALS

Table B.2
Percentages of Inmates Reporting Number of Times on Probation,
by Employment Status
                                 EMPLOYMENT STATUS
NUMBER OF
TIMES ON               Employed    Employed    Looking     Not looking
PROBATION     Total    full-time   part-time   for work    for work

None          100.0      60.8        11.6        13.8         13.8
1             100.0      55.1        12.3        17.3         15.3
2             100.0      52.9        12.2        17.9         17.0
3 or more     100.0      47.8        11.7        18.3         22.2

Source: U.S. Department of Justice, Bureau of Justice Statistics, Survey of Inmates in State Correctional Facilities, 1991.

Appendix B . . . . . . 131

APPENDIX C
Overview of Procedures Used in the
National Adult Literacy Survey

This appendix provides information about the methods and procedures used
in the National Adult Literacy Survey. The forthcoming technical report will
provide more extensive information about procedures. In addition, more
detailed information on the development of the background questionnaires and
literacy tasks can be found in Assessing Literacy.1

Sampling
The National and State Adult Literacy Surveys included the following three
components: a national household sample, 11 individual state household
samples, and a national prison sample. The national and state household
components were based on a four-stage stratified area sample with the
following stages: the selection of Primary Sampling Units (PSUs) consisting of
counties or groups of counties, the selection of segments consisting of census
blocks or groups of blocks, the selection of households, and the selection of
age-eligible individuals. One national area sample was drawn for the national
component; 11 independent, state-specific area samples were drawn for the 11
states participating in the state component (i.e., California, Illinois, Indiana,
Iowa, Louisiana, New Jersey, New York, Ohio, Pennsylvania, Texas,
Washington). The sample designs used for all 12 samples were similar, except
for two principal differences. In the national sample, Black and Hispanic
respondents were sampled at a higher rate than the remainder of the
population in order to increase their representation in the sample, whereas the
state samples used no oversampling. Also, the target population for the national
sample consisted of adults 16 years of age or older, whereas the target
population for the state samples consisted of adults 16 to 64 years of age.
1 A. Campbell, I. Kirsch, and A. Kolstad. (1992). Assessing Literacy: The Framework for the National Adult
Literacy Survey. Washington, DC: Government Printing Office.

Appendix C . . . . . . 133

The sample designs for all 12 household samples involved four stages of
selection, each at a successively finer level of geographic detail. The first stage
of sampling involved the selection of PSUs, which consist of counties or groups
of counties. The PSUs were stratified on the basis of region, metropolitan
status, percent Black, percent Hispanic, and, whenever possible, per capita
income. The national component used the WESTAT 100 PSU master sample
with the Honolulu, Hawaii PSU added to the sample with certainty, to make
101 PSUs in total. The national frame of PSUs was used to construct individual
state frames for the state component and a sample of eight to 12 PSUs was
selected within each of the given states. All PSUs were selected with
probability proportional to the PSU’s 1990 population.
The second stage of sampling involved the selection of segments (within
the selected PSUs) which consist of census blocks or groups of census blocks.
The segments were selected with probability proportional to size where the
measure of size for a segment was a function of the number of year-round
housing units within the segment. The oversampling of Black and Hispanic
respondents for the national component was carried out at the segment level,
where segments were classified as high minority (segments with more than 25
percent Black or Hispanic population) or not high minority. The measure of
size for high minority segments was defined as the number of White non-Hispanic
households plus three times the number of Black or Hispanic
households. High minority segments were therefore oversampled at up to
three times the rate of comparable, non-high-minority segments. The measure
of size for nonminority segments was simply the number of year-round housing
units within the segment, as was the measure of size for all segments in the
state components. One in 7 of the national component segments was selected
at random to be included in a “no incentive” sample. Respondents from the
remaining segments in the national component received a monetary incentive
for participation, as did respondents in the state component. (Respondents
from the “no incentive” segments are not included in the household sample of
this report.)
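The measure-of-size rule above can be sketched directly; the function name and household counts are ours, while the 25 percent threshold and the factor of three come from the text:

```python
# Sketch of the segment measure of size (MOS) described above. A segment is
# "high minority" when more than 25 percent of its population is Black or
# Hispanic; minority households then count three times toward the MOS.
def segment_mos(white_nonhispanic_hh, black_or_hispanic_hh):
    total = white_nonhispanic_hh + black_or_hispanic_hh
    high_minority = total > 0 and black_or_hispanic_hh / total > 0.25
    if high_minority:
        # Minority households count three times, so the segment can be
        # selected at up to three times the normal rate.
        return white_nonhispanic_hh + 3 * black_or_hispanic_hh
    # Non-high-minority (and all state-component) segments: just the
    # number of year-round housing units.
    return total

print(segment_mos(600, 400))  # high minority: 600 + 3 * 400 = 1800
print(segment_mos(900, 100))  # not high minority: 1000
```

The later subsampling of White non-Hispanic households within high-minority segments is what brings their overall selection rates back in line.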
The third stage of sampling involved the selection of households within
the selected segments. Westat field staff visited all selected segments and
prepared lists of all housing units within the boundaries of each segment as
determined by the 1990 census block maps. The lists were used to construct
the sampling frame for households. Households were selected with equal
probability within each segment, except for White non-Hispanic households in
high minority segments in the national component, which were subsampled so
that the sampling rates for White non-Hispanic respondents would be about
the same overall.

134 . . . . . . Appendix C

The fourth stage of sampling involved the selection of one or two adults
within each selected household. A list of age-eligible household members (16
and older for the national component, 16 to 64 for the state component) was
constructed for each selected household. One person was selected at random
from households with fewer than four eligible members; two persons were
selected from households with four or more eligible members. The
interviewers, who were instructed to list the eligible household members in
descending order by age, then identified one or two household members to
interview, based on computer-generated sampling messages that were attached
to each questionnaire in advance.
The sample design for the prison component involved two stages of
selection. The first stage of sampling involved the selection of state or federal
correctional facilities with probability proportional to size, where the measure
of size for a given facility was equal to the inmate population. The second stage
involved the selection of inmates within each selected facility. Inmates were
selected with a probability inversely proportional to their facility’s inmate
population (up to a maximum of 22 interviews in a facility) so that the product
of the first and second stage probabilities would be constant.
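Because the two stage probabilities cancel, every inmate has the same overall chance of selection. A sketch with invented population figures (the 22-interview cap is from the text):

```python
# Self-weighting two-stage prison design described above: facilities drawn
# with probability proportional to inmate population, inmates drawn with
# probability inversely proportional to it. All numbers are invented.
def facility_prob(facility_pop, total_pop, n_facilities):
    # Stage 1: probability proportional to size (PPS)
    return n_facilities * facility_pop / total_pop

def inmate_prob(interviews_per_facility, facility_pop):
    # Stage 2: inversely proportional to the facility's population
    return interviews_per_facility / facility_pop

total_pop, n_facilities, interviews = 700_000, 80, 22
for pop in (500, 2_000, 8_000):
    p = (facility_prob(pop, total_pop, n_facilities)
         * inmate_prob(interviews, pop))
    print(round(p, 8))  # identical for every facility size
```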

Weighting
Full sample and replicate weights were calculated for each record in order to
facilitate the calculation of unbiased estimates and their standard errors.
The full sample and replicate weights for the household components were
calculated as the product of the base weight for a record and a compositing and
raking factor. Demographic variables critical to the weighting were recoded
and imputed, if necessary, prior to the calculation of base weights.
The base weight was calculated as the reciprocal of the final probability of
selection for a respondent, which reflected all stages of sampling. The base
weight was then multiplied by a compositing factor which combined the
national and state component data in an optimal manner, considering the
differences in sample design, sample size, and sampling error between the two
components. Twelve different compositing factors were used, one for each of
the 11 participating states, and a pseudo factor (equal to one) for all national
component records from outside the 11 participating states. The product of the
base weight and compositing factor for a given record was the composite
weight.
The composite weights were raked so that several totals calculated with
the resulting full sample weights would agree with the 1990 census totals,
adjusted for undercount. The cells used for the raking were defined to the

Appendix C . . . . . . 135

finest combination of age, education level, race, and ethnicity that the data
would allow. Raking adjustment factors were calculated separately for each of
the 11 states and then for the remainder of the United States. The above
procedures were repeated for 60 strategically constructed subsets of the
sample to create a set of replicate weights to be used for variance estimation
using the jackknife method. The replication scheme was designed to produce
stable estimates of standard errors for national estimates as well as for the 11
individual states.
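Raking itself is iterative proportional fitting: weights are repeatedly scaled so that weighted totals match the control totals in each dimension. A minimal sketch with invented records and control totals (the survey's actual cells combined age, education, race, and ethnicity):

```python
# Minimal raking (iterative proportional fitting) sketch for the adjustment
# described above. Records, dimensions, and control totals are invented.
def rake(weights, categories, control_totals, n_iter=20):
    """categories: one dict per record mapping dimension -> category;
    control_totals: {dimension: {category: target weighted total}}."""
    w = list(weights)
    for _ in range(n_iter):
        for dim, targets in control_totals.items():
            current = {}
            for wi, cats in zip(w, categories):
                current[cats[dim]] = current.get(cats[dim], 0.0) + wi
            # Scale each weight so this dimension's totals hit the targets.
            w = [wi * targets[cats[dim]] / current[cats[dim]]
                 for wi, cats in zip(w, categories)]
    return w

recs = [{"age": "16-24", "educ": "HS"}, {"age": "16-24", "educ": "<HS"},
        {"age": "65+", "educ": "HS"}, {"age": "65+", "educ": "<HS"}]
controls = {"age": {"16-24": 180, "65+": 160},
            "educ": {"HS": 200, "<HS": 140}}
weights = rake([100.0] * 4, recs, controls)
```

After raking, the weighted age totals match 180 and 160 and the weighted education totals match 200 and 140, just as the survey's full-sample weights were made to agree with adjusted 1990 census totals.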
The full sample and replicate weights for the incarcerated component
were calculated as the product of the base weight for a record and a
nonresponse and raking factor. The base weight was calculated as the
reciprocal of the final probability of selection for a respondent, which reflected
both stages of sampling. The base weights were then nonresponse adjusted to
reflect both facility and inmate nonresponse. The resulting nonresponse
adjusted weights were then raked to agree with independent estimates for
certain subgroups of the population.

Background Questionnaires
One of the primary goals of the National Adult Literacy Survey is to relate the
literacy skills of the nation’s adults to a variety of demographic characteristics
and explanatory variables. Accordingly, survey respondents were asked to
complete background questionnaires designed to gather information on their
characteristics and experiences. To ensure standardized administration, the
questionnaires were read to the respondent by trained interviewers.
As recommended by the Literacy Definition Committee, the development
of the background questionnaire was guided by two goals: to ensure the
usefulness of the data by addressing issues of concern, and to ensure
comparability with the young adult and Department of Labor (DOL) job-seeker surveys by including some of the same questions. With these goals in
mind, the background questionnaire addressed the following areas:
• general and language background
• educational background and experiences
• political and social participation
• labor force participation
• literacy activities and collaboration
• demographic information

136 . . . . . . Appendix C

Questions in the first category asked survey participants to provide
information on their country of birth, their education before coming to the
United States, language(s) spoken by others at home, language(s) spoken while
growing up, language(s) spoken now, participation in English as a Second
Language courses, and self-evaluated proficiency in English and other
languages. This information makes it possible to interpret the performance
results in light of the increasing racial/ethnic and cultural diversity in the
United States.
The questions on educational background and experiences asked
respondents to provide information on the highest grade or level of education
they had completed; their reasons for not completing high school; whether or
not they had completed a high school equivalency program; their educational
aspirations; the types and duration of training they had received in addition to
traditional schooling; the school, home, or work contexts in which they learned
various literacy skills; and any physical, mental, or health conditions they have
that may affect their literacy skills. Information on respondents’ education is
particularly important because level of education is known to be a predictor of
performance on the prose, document, and quantitative literacy scales.
The questions on political and social participation asked participants about
the sources from which they get information, their television viewing practices,
their use of library services, and whether or not they had voted in a recent
election. Because an informed citizenry is essential to the democratic process,
information was collected on how adults keep abreast of current events and
public affairs. Information on adults’ use of library services is also important,
because libraries promote reading and often provide literacy programs. These
questions make it possible to explore connections between adults’ activities and
their demonstrated literacy proficiencies.
The questions on labor force participation asked participants to provide
information on their employment status, weekly wages or salary, weeks of
employment in the past year, annual earnings, and the industry or occupation
in which they work(ed). These questions respond to concerns that the literacy
skills of our present and future work force are inadequate to compete in the
global economy or to cope with our increasingly technological society. The
questions were based on labor force concepts widely used in economic surveys
and permit the exploration of a variety of labor market activity and experience
variables.
Questions on literacy activities and collaboration covered several
important areas. Some of the questions focused on the types of materials that
adults read, such as newspapers, magazines, books, and brief documents,
making it possible to investigate the relationship between reading practices and
demonstrated literacy proficiencies. Another set of questions asked
Appendix C . . . . . . 137

respondents about the frequency of particular reading, writing, and
mathematics activities. Respondents were asked to provide information on
their newspaper, magazine, and book reading practices; reading, writing, and
mathematics activities engaged in for personal use and for work; and assistance
received from others with particular literacy tasks.
Finally, the survey collected information on respondents’ race/ethnicity,
age, and gender, as well as the educational attainment of their parents, their
marital status, the number of people in their family who were employed full-time and part-time, sources of income other than employment, and family and
personal income from all sources. This demographic information enabled
researchers to analyze the characteristics of the adult population, as well as to
investigate the literacy proficiencies of major subpopulations of interest, such
as racial/ethnic groups, males and females, and various age cohorts.
Because some questions included in the household survey were
inappropriate for the prison population, a revised version of the background
questionnaire was developed for these respondents. Most of the questions in
the household background questionnaire on general and language background
and on literacy activities and collaboration were included. Many questions
concerning education, political and social participation, labor force
participation, family income, and employment status were not appropriate,
however, and were omitted. In their place, relevant questions were
incorporated from the 1991 Survey of Inmates of State Correctional Facilities,
sponsored by the Bureau of Justice Statistics of the U.S. Department of Justice.
As a result of these changes, the questionnaire for the prison population
addressed the following topics:
• general and language background
• educational background and experiences
• current offenses and criminal history
• prison work assignments and labor force participation
• literacy activities and collaboration
• demographic information
The information collected through these questions makes it possible, for the
first time, to explore complex relationships between prisoners’ literacy skills
and their experiences and characteristics.

138 . . . . . . Appendix C

Literacy Assessment Booklets
The National Adult Literacy Survey measures literacy along three scales —
prose, document, and quantitative — composed of literacy tasks that simulate
the types of demands that adults encounter in everyday life. The literacy tasks
administered in this survey included 81 new tasks as well as 85 tasks that were
included in the previous young adult and job-seeker surveys. The
administration of a common pool of tasks in each of the three surveys allows for
valid comparisons of results across time for different populations.
The new literacy tasks developed for the survey serve to refine and extend
the three existing literacy scales and provide a better balance of tasks across the
three scales. The framework used to develop these tasks reflects research on
the processes and strategies that respondents used to perform the literacy tasks
administered in the young adult survey. In creating the new tasks, one goal was
to include diverse stimulus materials and to create questions and directives that
represent the broad range of skills and processes inherent in the three domains
of literacy. Another goal was to create tasks that reflect the kinds of reading,
writing, and computational demands that adults encounter in work,
community, and home settings. Because the tasks are meant to simulate real-life literacy activities, they are open-ended — that is, individuals must produce
a written or oral response, rather than simply choose the correct response from
a list of options.
The new literacy tasks were developed with attention to the following
elements:
• the structure of the stimulus material — for example, exposition,
narrative, table, graph, map, or advertisement
• the content represented and/or the context from which the
stimulus is drawn — for example, work, home, or community
• the nature of what the individual is asked to do with the material
— that is, the purpose for using the material — which in turn
guides the strategies needed to complete the task successfully

These factors, operating in various combinations, affect the difficulty of a task
relative to others administered in the survey.
The printed and written materials selected for the survey reflect a variety
of structures and formats. Most of the prose materials are expository — that is,
they describe, define, or inform — since most of the prose that adults read is
expository; however, narratives and poetry are included as well. The prose
selections include an array of linguistic structures, ranging from texts that are
highly organized both topically and visually, to those that are loosely organized.

Texts of varying lengths were chosen, ranging from full-page magazine
selections to short newspaper articles. All prose materials included in the
survey were reproduced in their original format.
The document materials represent a wide variety of structures, including
tables, charts and graphs, forms, and maps. Tables include matrix documents in
which information is arrayed in rows and columns (for example, bus or airplane
schedules, lists, or tables of numbers). Documents categorized as charts and
graphs include pie charts, bar graphs, and line graphs. Forms are documents
that must be filled in, while other structures include advertisements and
coupons.
Quantitative tasks require the reader to perform arithmetic operations
using numbers that are embedded in print. Since there are no materials that
are unique to quantitative tasks, they were based on prose materials and
documents. Most quantitative tasks were, in fact, based on documents.
Adults do not read printed or written materials in a vacuum. Rather, they
read within a particular context or for a particular purpose. Accordingly, the
survey materials were chosen to represent a variety of contexts and contents.
Six such areas were identified: home and family, health and safety, community
and citizenship, consumer economics, work, and leisure and recreation. Efforts
were made to include as broad a range as possible and to select universally
relevant contexts and contents to ensure that the materials would be familiar to
all participants. In this way, the disadvantages for individuals with limited
background knowledge were minimized.
After the materials were selected, accompanying tasks were developed.
The tasks were designed to simulate the way in which people use various types
of materials and to require different strategies for successful performance. For
both the prose and document scales, the tasks can be organized into three
major categories: locating, integrating, and generating information. In the
locating tasks, readers were asked to match information given in a question or
directive with either literal or synonymous information in the text or document.
Integrating tasks asked the reader to incorporate two or more pieces of
information from different parts of the text or document. Generating tasks
required readers not only to process information located in different parts of
the material, but also to draw on their knowledge about a subject or to make
broad, text-based inferences.
Quantitative tasks required readers to perform one or more arithmetic
operations (addition, subtraction, multiplication, or division) either singly or in
combination. The type of operation to be performed was sometimes obvious
from the wording of the question; in other tasks the readers had to infer which
operation was to be performed. In some cases the numbers required to
perform the operation could be easily identified; in others they were
embedded in text. Some quantitative tasks asked the reader to explain how he
or she would solve a problem, rather than to perform the actual calculation.
The use of a simple, four-function calculator was required for some tasks.

Survey Design: BIB Spiralling
No individual could be expected to respond to the entire set of 166 simulation
tasks administered as part of the survey. Accordingly, the survey design gave
each respondent a subset of the total pool of literacy tasks, while at the same
time ensuring that each of the 166 tasks was administered to a nationally
representative sample of the adult population. Literacy tasks were assigned to
blocks or sections that could be completed in about 15 minutes, and these
blocks were then compiled into booklets so that each block appeared in each
position (first, middle, and last) and each block was paired with every other
block. Thirteen blocks of simulation tasks were assembled into 26 booklets,
each of which could be completed in about 45 minutes. During a personal
interview, each participant was asked to complete one booklet of literacy tasks
and the background questionnaire, which required approximately 20 minutes.
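The pairing and position constraints above describe a balanced incomplete block (BIB) design. The survey's actual block-to-booklet assignment is not reproduced here; as an illustration only, a cyclic Steiner triple system on 13 blocks yields 26 three-block booklets in which every pair of blocks shares exactly one booklet and every block occupies each position equally often:

```python
from itertools import combinations
from collections import Counter

NUM_BLOCKS = 13                          # 13 blocks of literacy tasks
BASE_BOOKLETS = [(0, 1, 4), (0, 2, 7)]   # cyclic base blocks of a Steiner triple system on 13 points

# Build 26 booklets of 3 blocks each by cycling the base blocks mod 13.
booklets = [tuple((b + shift) % NUM_BLOCKS for b in base)
            for base in BASE_BOOKLETS for shift in range(NUM_BLOCKS)]

# Verify the BIB properties described in the text.
pair_counts = Counter(frozenset(p) for bk in booklets for p in combinations(bk, 2))
block_counts = Counter(b for bk in booklets for b in bk)
position_counts = Counter((pos, b) for bk in booklets for pos, b in enumerate(bk))

assert len(booklets) == 26
assert all(c == 6 for c in block_counts.values())      # each block appears in 6 booklets
assert all(c == 1 for c in pair_counts.values())       # every pair of blocks shares exactly one booklet
assert len(pair_counts) == 78                          # all C(13,2) pairs are covered
assert all(c == 2 for c in position_counts.values())   # each block appears twice in each position
```

The base blocks (0, 1, 4) and (0, 2, 7) form a difference family modulo 13, which is what guarantees each pair of blocks co-occurs exactly once.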

Training the Data Collection Staff
For the national and state samples, 24 field supervisors, 24 field editors, and
421 field interviewers were recruited and trained in January and February of
1992. The 24 supervisors were trained first at a session in Bethesda, Maryland.
The seven-day program included the interviewer training. Additionally, Westat
provided training specific to supervisory responsibilities, including the use of
Westat’s Automated Survey Control System, a computer-based system for
managing the data collection effort. Finally, supervisors and editors were
trained to perform an item-by-item edit for each data collection instrument
received from the field interviewers.
After the training offered in Bethesda, interviewers attended training
sessions geographically closest to their homes, either San Francisco (January
31–February 2) or Dallas (February 7–9). Four training groups were formed at
each of the two training sites. Each group was led by a Westat home office field
manager. Within each of the four groups, the trainees were divided into
“learning communities” with approximately 18 interviewers each. Each
community was led by the field supervisor who would supervise the
interviewers during the data collection phase.
The training program was modeled closely after Westat’s general approach
for training field staff. This approach uses a mix of techniques to present study material, focusing heavily on trainee participation and practice. The training
program was standardized with verbatim scripts and a detailed agenda to
ensure comparability in presentation across groups.
The key training topics were the data collection instruments — the
household screener, the background questionnaire, and the interview guide
and literacy exercise booklet. The majority of training time was devoted to
instructions for administering these documents. In addition, sessions were used
to present instructional material on gaining respondent cooperation, keeping
records of nonresponse cases, editing completed work, and completing
administrative forms. A bilingual field supervisor provided Spanish-speaking
interviewers with training on the Spanish translations of the screener and
background questionnaires.
Prior to project-specific training, new interviewers attended an additional
one-half day of training on general interviewing techniques. Interviewers
selected to work on the prison sample received an additional day of training on
interview procedures unique to that sample.

Administering the Data Collection Instruments
Data collection instruments included the screener, which was designed to
enumerate household members and select survey respondents, the background
questionnaire, and the literacy exercise booklets. Interviewers were given their
first assignments and began work immediately after training. The interviewer
was given a call record folder and screener for each sampled dwelling unit in
his or her assignment. A computer-generated label attached to the front of
each folder and screener provided the case identification number, address, and
assigned exercise booklet number. Additionally, interviewers were provided
with all other field materials necessary to conduct interviews and meet
reporting requirements.
Case assignments were made by the field supervisors, who also mailed
letters to households about one week before the interviewers planned to
contact the household. When making contact, the interviewer first verified that
the address was in the sample and the unit was, in fact, an occupied dwelling. If
the unit did not meet the definition of a year-round housing unit or was vacant,
or for some other reason the interviewer was unable to complete a screener at
an assigned address, she or he documented the situation in a noninterview
report form.
The interviewer introduced the study using an introduction printed on the
front of the screener. As part of the introduction, the interviewer indicated that
if someone from the household was selected for an interview, the respondent would be paid $20 for participating. After introducing the study, the
interviewer proceeded to conduct the screening interview with any household
member 16 years of age or older. If the household members spoke only a
language other than Spanish or English, the interviewer could obtain the
services of a translator to complete the screener interview.
The screener was used to collect names, relationships, sex, age and race/
ethnicity of all household members at the selected dwelling unit. For the
national sample, household members aged 16 years and older were eligible for
selection. For the state sample, however, household members 16 to 64 years of
age were eligible. In households with three or fewer eligible household
members, one was randomly selected for the interview. In households with
four or more eligibles, two respondents were selected. To select respondents,
interviewers first listed the names and ages (in descending age order) of all
eligible household members. They then referred to a sampling table which
selected one or two respondents from the household.
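In outline, the selection rule reduces to the following sketch. The interviewers used a printed sampling table keyed to the age-ordered listing, so the uniform random draw here is a simplifying assumption:

```python
import random

def select_respondents(eligible_members, rng=random):
    """Pick survey respondents from a household's eligible members:
    one respondent when there are three or fewer eligibles, two when
    there are four or more (per the rule described in the text).
    A uniform random draw stands in for the printed sampling table."""
    how_many = 1 if len(eligible_members) <= 3 else 2
    return rng.sample(eligible_members, how_many)
```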
Once the screener was completed and a respondent or respondents selected, the
interviewer proceeded to administer the background questionnaire and the
exercise booklet. If the selected respondent was not available at the time the
screener was conducted, the interviewer returned to administer the
background questionnaire and exercise booklet, which were administered on
the same visit.
The background questionnaire took approximately 20 minutes to
administer and could be conducted in English or Spanish (using the Spanish
printed version) only. In the introduction to the background questionnaire, the
respondent was told that he or she would be given a check for $20 in
appreciation of the time and effort involved in completing the interview,
questionnaires, and assessment. The background questionnaire was divided
into six sections and collected demographic data as well as data on literacy-related behaviors. Respondents from each of the 11 participating states were
asked five state-specific questions, which appeared at the end of the
questionnaire.
When the background questionnaire was completed, the interviewer
administered the exercise booklet, which took approximately 45 minutes.
There were 26 different versions of the exercise booklet, and each version had
a corresponding interview guide, which the interviewer used to facilitate the
respondent’s completion of tasks in the booklet.
For the prison population, the interviewer informed the selected inmate
about the study using an introduction printed in the background questionnaire
since there was no screener. As part of the introduction, the interviewer
indicated that the inmate would receive a certificate of participation if he or she completed the survey. Because of varying prison regulations, it was not
possible to pay inmates $20 for their participation and so they received the
certificate. The background questionnaire and exercise booklet were
administered using the same procedures as for the household population.

Response Rates
Since there were three instruments — screener, background questionnaire,
and exercise booklet — required for the administration of the survey, it was
possible for a household or respondent to refuse to participate at the time of
the administration of any one of these instruments. Thus, response rates were
calculated for each of the three instruments. For the prison sample there were
only two points at which a respondent could refuse — at the administration of
either the background questionnaire or exercise booklet. The response rates
presented below reflect the percentage of those who had the opportunity to
participate at each stage of the survey. The response rates for the national
household and prison samples are as follows.

                              Response Rates
Instrument                   National     Prison
Screener                       89.1%        N/A
Background Questionnaire       81.0%       85.7%
Exercise Booklet               95.8%       96.1%

Data Collection Quality Control
Several quality control procedures relating to data collection were used. These
included the interviewer field edit, a complete edit of all documents by a
trained field editor, validation of 10 percent of each interviewer’s close-out
work, and field observation of both supervisors and interviewers.
At the interviewer training session, interviewers were instructed on
procedures for performing a field edit of all data collection documents. The
main purpose of this edit was to catch and correct or explain any errors or
omissions in recording, to learn from mistakes so they were not repeated, and
to remove stray marks and completely fill in bubbles on the documents that
were to be optically scanned.
Additionally, a complete edit was performed on all documents by a trained
field editor. An item-by-item review was performed on each document, and
each error was fully documented on an edit form. The supervisor reviewed the results of the edit with the interviewer during his or her weekly telephone
conference.
Validation is the quality control procedure used to verify that an interview
was conducted and it took place at the correct address and according to
specified procedures, or that nonresponse statuses (e.g., refusals, vacancies,
language problems) were accurately reported by the interviewers. Interviewers
knew that their work would be validated but did not know to what extent or
which cases. A 10 percent subsample of dwelling units was selected and
flagged in the supervisor’s log and in the automated survey control system
(ASCS). The supervisors performed validation interviews by telephone if a
phone number was available. Otherwise, validation was performed in person by
the supervisor or by another interviewer.
Field observations of both supervisors and interviewers were performed
by Westat field management staff. One purpose of the interviewer observation
was to provide home office staff with an opportunity to observe both the performance of field procedures and respondents’ reactions to the survey.
Another purpose was to provide feedback to weak interviewers when there was
concern about their skills and/or performance. In addition to in-person
observations, interviewers were required to tape record one complete
interview and assessment. The field supervisor selected the particular case in
advance and listened to the tape to “observe” each interviewer.
Finally, nine of the 24 supervisors were visited by field management staff
and evaluated on their editing, coding, office organization, ability to maintain
up-to-date records on production data, and supervision of interviewers.

Scoring the Literacy Exercise Booklets
As the first shipments of exercise booklets were received at ETS, copies were
made of actual responses to the tasks. These sample responses were then
scored by various staff, including the test developer and scoring supervisor,
using either the scoring guides developed for the young adult tasks or guides
prepared during the development of the new tasks. As the sample responses
were scored, adjustments were made to the scoring guides for the new tasks to
reflect the kinds of answers that the respondents were providing.
The sample papers comprised the training sets used to train a group of
readers who would score the exercise booklets. The purposes of the training
were to familiarize the readers with the scoring guides and to ensure a high
level of agreement among the readers. Each task and its scoring guide were
explained and sample responses representative of the score points in the guide
were discussed. The readers then scored and discussed an additional 10 to 30 responses. After group training had been completed, all the readers scored all
the tasks in over a hundred booklets to give them practice in scoring actual
booklets, as well as an opportunity to score more responses on a practice basis.
A follow-up session was then held to discuss responses on which readers
disagreed. The entire training process was completed in about four weeks.
Twenty percent of all the exercise booklets were subjected to a reader
reliability check, which entailed a scoring by a second reader. To prevent the
second reader from being influenced by the first reader’s scores, the first
reader masked the scores in every fifth booklet that he or she scored. These
booklets were then passed on for a second reader to score. When the second
reader had scored every item, the first reader’s scores were unmasked. If there
was a discrepancy between the two scores for any response, the scoring
supervisor reviewed the response and discussed it with the readers involved.
The statistic used to report inter-reader reliability is the percentage of
exact agreement — that is, the percentage of times the two readers agreed
exactly in their scores. There was a high degree of reader reliability across all
the tasks in the survey, ranging from a low of 88.1 percent to a high of 99.9
percent, with an average agreement of 97 percent. For 133 out of 166 open-ended tasks, the agreement was above 95 percent.
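The exact-agreement statistic is straightforward to compute; a minimal sketch:

```python
def exact_agreement(scores_reader1, scores_reader2):
    """Percentage of responses on which two readers assigned identical
    scores: the inter-reader reliability statistic used in the text."""
    if len(scores_reader1) != len(scores_reader2):
        raise ValueError("score lists must be the same length")
    matches = sum(a == b for a, b in zip(scores_reader1, scores_reader2))
    return 100.0 * matches / len(scores_reader1)

# e.g. exact_agreement([1, 0, 2, 1], [1, 0, 2, 0]) -> 75.0
```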

Data Entry
The background questionnaire was designed to be read by a computerized
scanning device. For most questions, field personnel filled in ovals next to the
respondent’s answers. Open-ended items in the background questionnaire
were coded and the ovals filled in by ETS staff before they were shipped to the
scanning department. Responses on the screener were transferred to scannable
documents by ETS personnel when the check-in process was complete, and
the screener documents were batched and sent to the scanning department on
a regular basis. Exercise booklet scores were transferred to scannable
documents by the readers who scored the items, and these were also batched
and sent to the scanning department at regular intervals. The scanned data
from screeners, background questionnaires, and exercise booklets were
transmitted to magnetic tape, which was then sent to the ETS computer center.
As each of the different instruments was processed, the data were transferred
to a database on the main computer for editing.

Editing and Quality Control
Editing included an assessment of the internal logic and consistency of the data
received. For example, data were examined for nonexistent housing locations
or booklets, illogical or inconsistent responses, and multiple responses. Where
indicated, an error listing was generated and sent back to the processing area,
where the original document was retrieved and the discrepancies were
corrected. If resolution of a conflict in the data was not possible, the
information was left in the form in which it was received. Wherever possible,
however, conflicts were resolved. For example, in the infrequent cases in which
field personnel provided more than one response to a single-response
noncognitive item, specific guidelines were developed to incorporate these
responses consistently and accurately. The background questionnaires were
also checked to make sure that the skip patterns had been followed and all data
errors were resolved. In addition, a random set of booklets was selected to
provide an additional check on the accuracy of transferring information from
booklets and answer sheets to the database.

Scaling
The results from the National Adult Literacy Survey are reported on three
scales established by the NAEP 1985 Young Adult Literacy Survey: prose
literacy, document literacy, and quantitative literacy. With scaling methods, the
performance of a sample of examinees can be summarized on a series of
subscales even when different respondents have been administered different
items. Conventional scoring methods are not suited for assessments like the
national survey. Statistics based on the number of correct responses, such as
proportion of correct responses, are inappropriate for examinees who receive
different sets of items. Moreover, item-by-item reporting ignores similarities of
subgroup comparisons that are common across items. Finally, using average
percent correct to estimate means of proficiencies of examinees within
subpopulations does not provide any other information about the distribution
of skills among the examinees.
The limitations of conventional scoring methods can be overcome by the
use of item response theory (IRT) scaling. When several items require similar
skills, the response patterns should have some uniformity. Such uniformity can
be used to characterize both examinees and items in terms of a common scale
attached to the skills, even when all examinees do not take identical sets of
items. Comparisons of items and examinees can then be made in reference to a
scale, rather than to percent correct. IRT scaling also allows distributions of
groups of examinees to be compared.

Scaling was carried out separately for each of the three domains of literacy
(prose, document, and quantitative). The NAEP reading scale, used in the
young adult survey, was dropped because of its lack of relevance to the current
NAEP reading scale. The scaling model used for the national survey is the
three-parameter logistic (3PL) model from item response theory.2 It is a
mathematical model for estimating the probability that a particular person will
respond correctly to a particular item from a single domain of items. This
probability is given as a function of a parameter characterizing the proficiency
of that person, and three parameters characterizing the properties of that item.
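In its standard form (the survey's exact parameterization, including the conventional scaling constant D = 1.7, is an assumption here), the 3PL model can be sketched as:

```python
import math

def p_correct_3pl(theta, a, b, c, D=1.7):
    """Standard three-parameter logistic (3PL) model: probability that a
    person with proficiency theta answers an item correctly, given the
    item's discrimination (a), difficulty (b), and lower-asymptote
    "guessing" (c) parameters. D = 1.7 is the usual scaling convention."""
    return c + (1.0 - c) / (1.0 + math.exp(-D * a * (theta - b)))

# At theta equal to the item difficulty b, the probability is midway
# between the guessing floor c and 1: p_correct_3pl(0, 1.2, 0, 0.2) -> 0.6
```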

Overview of Linking the National Adult Literacy Survey (NALS)
Scales to the Young Adult Literacy Survey (YALS) Scales
Prose, document, and quantitative literacy results for the National Adult
Literacy Survey are reported on scales that were established in the Young Adult
Literacy Survey. For each scale, a number of new items unique to the national
survey were added to the item pool that was administered in the original young
adult survey. The NALS scales are linked to the YALS scales based upon the
commonality of the two assessments, namely, the original young adult survey
common items. Fifty-one percent of the items administered in the national
survey were common to the young adult survey. The composition of the item pool
is presented in table C.1.

Table C.1
Composition of the Item Pool for the National Adult Literacy Survey

                             Number of Items
Scale            YALS items     New items     NALS total
Prose                14             27             41
Document             56             25             81
Quantitative         15             28             43
Total                85             81            165

Source: Educational Testing Service, National Adult Literacy Survey, 1992.

2 A. Birnbaum. (1968). “Some Latent Trait Models.” In F.M. Lord and M.R. Novick, Statistical Theories of Mental Test Scores. Reading, MA: Addison-Wesley. F.M. Lord. (1980). Applications of Item Response Theory to Practical Testing Problems. Hillsdale, NJ: Erlbaum.

A unidimensional IRT model like the three-parameter logistic model
employed in this study assumes that performance on all the items in a domain
can, for the most part, be accounted for by a single (unobservable) proficiency
variable. Subsequent IRT linking and scaling analyses treat each scale
separately, that is, a unique proficiency is assumed for each scale. As a result,
the linking of corresponding scales was carried out for each pair of scales
separately. The three steps used to link the scales are listed below.
1. Establish provisional IRT scales through common item parameter
   calibration based on a pooling of the NALS and YALS items.

2. Estimate distributions of proficiencies on the provisional IRT scales
   using “plausible value” methodology.

3. Align the NALS scale to the YALS scale by a linear transformation based
   upon the commonality of proficiency distribution of the YALS sample.
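Step 3's linear alignment can be illustrated with a mean-sigma sketch: choose A and B in y = Ax + B so the transformed scores reproduce a reference mean and standard deviation. The survey's actual alignment procedure is more involved; this is a simplification:

```python
import statistics

def mean_sigma_link(provisional, target_mean, target_sd):
    """Linearly transform provisional-scale scores (y = A*x + B) so they
    reproduce a reference mean and standard deviation, a simplified
    stand-in for the NALS-to-YALS alignment in step 3."""
    m = statistics.mean(provisional)
    s = statistics.stdev(provisional)
    A = target_sd / s
    B = target_mean - A * m
    return [A * x + B for x in provisional]
```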

Statistical Procedures
The statistical comparisons in this report were based on the t statistic.
Generally, whether or not a difference is considered significant is determined
by calculating a t value for the difference between a pair of means, or
proportions, and comparing this value to published tables of values at certain
critical levels, called alpha levels. The alpha level is an a priori statement of the
probability of inferring that a difference exists when, in fact, it does not.
In order to make proper inferences and interpretations from the statistics,
several points must be kept in mind. First, comparisons resulting in large t
statistics may appear to merit special note. This is not always the case, because
the size of the t statistic depends not only on the observed differences in means
or the percentage being compared, but also on the standard error of the
difference. Thus, a small difference between two groups with a much smaller
standard error could result in a large t statistic, but this small difference is not
necessarily noteworthy. Second, when multiple statistical comparisons are
made on the same data, it becomes increasingly likely that an indication of a
population difference is erroneous. Even when there is no difference in the
population, at an alpha level of .05, there is still a 5 percent chance of
concluding that an observed t value representing one comparison in the sample
is large enough to be statistically significant. As the number of comparisons
increases, the risk of making such an error in inference also increases.
To guard against errors of inference based upon multiple comparisons, the
Bonferroni procedure to correct significance tests for multiple contrasts was
used. This method corrects the significance (or alpha) level for the total
number of contrasts made with a particular classification variable. For each classification variable, there are K(K–1)/2 possible contrasts (or nonredundant pairwise comparisons), where K is the number of categories.
The Bonferroni procedure divides the alpha level for a single t test (for
example, .05) by the number of possible pairwise comparisons in order to give
a new alpha that is corrected for the fact that multiple contrasts are being
made.
The formula used to compute the t statistic is as follows:

t = (P1 – P2) / √(se1² + se2²)

where P1 and P2 are the estimates to be compared and se1 and se2 are their
corresponding standard errors.
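Both the t statistic and the Bonferroni adjustment described above are direct to compute; a minimal sketch:

```python
import math

def t_statistic(p1, se1, p2, se2):
    """t = (P1 - P2) / sqrt(se1^2 + se2^2), the formula given in the text."""
    return (p1 - p2) / math.sqrt(se1 ** 2 + se2 ** 2)

def bonferroni_alpha(alpha, k):
    """Divide the single-test alpha by the K(K-1)/2 possible pairwise
    contrasts for a classification variable with k categories."""
    return alpha / (k * (k - 1) / 2)

# e.g. estimates 290 and 280 with standard errors 3 and 4:
# t_statistic(290, 3, 280, 4) -> 2.0
# and for a 5-category variable: bonferroni_alpha(0.05, 5) -> 0.005
```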

APPENDIX D
Definitions of All Subpopulations
and Variables Reported
(In Order of Appearance)

Prison Population
The prison sample includes only those individuals who were in state or federal
prisons at the time of the survey. Those held in local jails, community-based
facilities, or other types of institutions were not surveyed.
Household Population
The household population includes adults aged 16 and older who participated
in the national household survey and the state surveys.
Highest Level of Education Completed
Respondents were asked to indicate the highest level of education they
completed in this country. The following options were given:
Still in high school (not applicable to the prison population)
Less than high school
Some high school
GED or high school equivalency
High school graduate
Vocational, trade, or business school after high school
College, less than 2 years
College, associate’s degree (A.A.)
College, 2 or more years, no degree
College graduate (B.S. or B.A.)
Postgraduate, no degree
Postgraduate degree (M.S., M.A., Ph.D., M.D., etc.)
In most tables, less than high school and some high school were collapsed into
one group (0 to 12 years); GED recipients and high school graduates were
collapsed into another group; and vocational, college, and postgraduate
categories were collapsed into a postsecondary group.

In other tables, less than high school (0 to 8 years), some high school (9 to
12 years), GED, and high school diploma were presented as separate groups,
with the postsecondary group collapsed as described above.
A third education variable was derived by collapsing less than high school
and some high school into one category and all other categories into a high
school diploma, GED, or higher group.
Finally, a fourth grouping categorizes education level as less than high
school (0 to 8 years), some high school (9 to 12 years), GED, high school
diploma, and two postsecondary groups — those who had not received a
degree (some postsecondary) and those who had (postsecondary degree).
Average Years of Schooling
Responses to the question on the highest level of education completed were
used to calculate the average number of years of schooling completed. For the
household population, individuals who were still in high school were left out of
this analysis. Adults who had not graduated from high school were asked to
indicate exactly how many years of schooling they had completed (0 through
11). Individuals who did not provide this information were assigned a value
equal to the average number of years of schooling completed by those who did
provide this information. For adults in the category of 0 to 8 years of education,
the average number of years of schooling was 6.10. For adults in the category
of 9 to 12 years of education, the average number of years of schooling was
10.11. The remaining adults were assigned values representing the number of
years of schooling completed, as follows:
GED, high school equivalency               12
High school graduate                       12
Vocational, trade, or business school      13
College, less than 2 years                 13
College, associate’s degree (A.A.)         14
College, 2 or more years, no degree        14.5
College graduate (B.S. or B.A.)            16
Postgraduate, no degree                    17
Postgraduate degree                        18

Using these values, the average number of years of schooling was calculated for each racial/ethnic group.
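As a sketch, the category-to-years assignment and the averaging step look like this (the helper name is hypothetical; the values are those listed above):

```python
# Years of schooling assigned to each education category (values from the
# table above). Adults below high school completion reported exact years,
# with nonresponses imputed at the category means (6.10 and 10.11).
YEARS_BY_CATEGORY = {
    "GED, high school equivalency": 12,
    "High school graduate": 12,
    "Vocational, trade, or business school": 13,
    "College, less than 2 years": 13,
    "College, associate's degree (A.A.)": 14,
    "College, 2 or more years, no degree": 14.5,
    "College graduate (B.S. or B.A.)": 16,
    "Postgraduate, no degree": 17,
    "Postgraduate degree": 18,
}

def average_years(category_labels):
    """Mean years of schooling over a group of respondents, given each
    respondent's education category (hypothetical helper)."""
    return sum(YEARS_BY_CATEGORY[c] for c in category_labels) / len(category_labels)

# e.g. average_years(["High school graduate", "Postgraduate degree"]) -> 15.0
```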
Race/Ethnicity
Respondents were asked two questions about their race and ethnicity. One
question asked them to indicate which of the following terms best described
them. The interviewer recorded from observation the races of respondents who
refused to answer the question.

White
Black (African American)
American Indian
Alaskan Native
Pacific Islander
Asian
Other

The other question asked respondents to indicate whether they were of
Spanish or Hispanic origin or descent. Those who responded “yes” were asked
to identify which of the following groups best describes their Hispanic origin:
Mexicano, Mexican, Mexican American, Chicano
Puerto Rican
Cuban
Central/South American
Other Spanish/Hispanic
All those who indicated they were of Spanish or Hispanic origin were
grouped together, regardless of their origin. Adults of Pacific Islander origin
were grouped with those of Asian origin, and Alaskan Natives were grouped
with American Indians. All other racial/ethnic groups are reported separately.
In most analyses, however, results are reported only for the White, Black, and
Hispanic subpopulations because the sample sizes of the other groups were too
small to provide reliable estimates.
A second race/ethnicity variable was derived by combining the Hispanic
group with all others.
Sex
The interviewers recorded the sex of each respondent.
Age
Respondents were asked to report their date of birth, and this information was
used to calculate their age. In most analyses, ages were then grouped as
follows: 16 to 24, 25 to 34, and 35 and older. Another grouping presents age as
follows: 16 to 18, 19 to 24, 25 to 34, 35 to 54, 55 to 64, and 65 and older.
Presence and Type of Physical, Mental, or Other Health Condition
Respondents were asked a series of questions in which they were asked to
identify whether they had any of the following:
a physical, mental, or other health condition that keeps them from
participating fully in work, school, or other activities
difficulty seeing the words or letters in ordinary newspaper print even
when wearing glasses or contact lenses, if they usually wear them

difficulty hearing what is said in a normal conversation with another
person even when using a hearing aid, if they usually wear one
a learning disability
any mental or emotional condition
mental retardation
a speech disability
a physical disability
a long-term illness (6 months or more)
any other health impairment
Respondents were able to indicate each physical, mental, or health condition
they had; thus, these categories are not mutually exclusive. From this series of
questions, one variable was derived with two categories: one or more
disabilities, and no disability.
Data are also reported by each of the specific disabilities.
Reason for Dropping Out of School
Respondents who reported that they had less than high school, some high
school, or a GED were asked to indicate the main reason for dropping out of
school. They were asked to choose from the following reasons:
financial problems
went to work or into the military
pregnancy
lost interest or behavior problems in school
academic problems in school
family or personal problems
convicted of crime or sent to jail/prison/detention center
other
The categories of lost interest and academic problems were grouped together
and the category of pregnancy was grouped with other because of small sample
sizes.
Level of Parental Education
Respondents were asked to indicate the highest level of education completed
by their mother (or stepmother or female guardian) and by their father (or
stepfather or male guardian). The analyses in this report are based on the
highest level of education attained by either parent.
Language Spoken in the Home While Growing Up
Respondents were asked what language or languages were spoken in their
home while growing up. Three categories were then derived: English only,
English and any other language, and any language or languages other than
English.
Occupation
Inmates who indicated that they had worked within the last three years while
not incarcerated were asked two questions about their occupation. The first
question asked them to indicate their occupation or the name of their job —
for example, electrical engineer, stock clerk, typist, or farmer. The second
question asked them to describe the most important activities or duties of the
job. Responses were coded according to the Bureau of Census occupation
codes. These codes were then collapsed into four main categories: professional,
sales or administrative, craft or service, and assembly, labor, farm, or
transportation.
Monthly Income
Inmates who indicated they had worked within the last three years were also
asked to indicate their average monthly earnings before deductions. Responses
were grouped into four categories: 0 to $499, $500 to $999, $1,000 to $1,499,
and $1,500 or more.
Current Offense
Inmates were asked for what offenses they were currently in prison. Those who
indicated more than one offense were asked for which offense they had received
the longest sentence, and that offense was used to group them. The reporting
categories are composed as follows:
violent: homicide, rape, sexual assault, robbery, kidnapping, or assault
property: burglary, larceny, auto theft, fraud, embezzlement, forgery,
arson, stolen property, trespassing, hit and run, and other property
drugs: trafficking, possession or use, and other drug related
public order and other: weapons offense, rioting, contempt of court,
morals/decency offense, probation and parole violations, minor traffic
violations
Federal offenses were not included.
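The longest-sentence rule for inmates reporting multiple offenses can be sketched as follows; the offense-to-category mapping shown is a small illustrative subset, and the function name and record layout are assumptions, not taken from the survey files.

```python
# Illustrative subset of the offense-to-category mapping described above.
OFFENSE_GROUP = {
    "homicide": "violent", "rape": "violent", "robbery": "violent",
    "burglary": "property", "fraud": "property", "arson": "property",
    "trafficking": "drugs", "possession or use": "drugs",
    "weapons offense": "public order and other",
}

def current_offense_group(offenses):
    """offenses: list of (offense_name, sentence_in_months) pairs.

    When an inmate reports more than one offense, the offense carrying
    the longest sentence determines the reporting category.
    """
    name, _ = max(offenses, key=lambda pair: pair[1])
    return OFFENSE_GROUP[name]
```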
Length of Sentence
This variable was derived from questions that asked for date of admission to
prison and date of expected release. The categories are as follows: 1 to 60
months, 61 to 120 months, 121 or more months, do not expect to be released,
and don’t know.

Participation in Education and Vocational Training Programs
Inmates were asked two questions about their participation in education and
vocational training programs. Four categories were derived: no participation in
either, participation in education classes only, in vocational classes only, and in
both kinds of classes.
Involvement in Prison Work Assignments
Inmates indicated whether or not they currently had any work assignments,
whether inside or outside the prison.
Type of Work Assignment
Inmates who indicated that they had a work assignment were asked to indicate
what assignments they had. The options included the following:
goods production
general janitorial duties
ground or road maintenance
food preparation or related duties
laundry
hospital, infirmary, or other medical services
farming, forestry, or ranching
other services, e.g., library, stockroom, store
maintenance, repair, or construction
enrolled in school
other
Laundry, hospital, and farming were collapsed into the other group because
their sample sizes were too small to provide reliable estimates.
Participation in Groups
Inmates indicated whether or not they had ever joined any organization
authorized by prison authorities.
Type of Groups Joined
Inmates who indicated that they had joined a group were asked to indicate
what groups they had joined. The options from which they had to choose were
grouped as follows:
addiction: drug awareness or dependency and alcohol-related groups
religious: religious study group and religious activities
life skills: other prisoner self-help or personal improvement group, classes
in parenting, classes in life skills, and arts and crafts
ethnic or racial organization
prisoner assistance group
outside community activities
prerelease programs
other
Number of Groups Joined
Because the question regarding what groups were joined allowed multiple
responses, a variable was derived based on the number of different groups the
inmates indicated in response to that question.
Recidivism
Recidivism was defined as a prior sentence to probation and/or incarceration.
The data were derived from two series of questions. One series asked inmates
if they had been on probation and if so, how many times as a juvenile and as an
adult. Another series asked if they had ever served time in prison and if so, how
many times as a juvenile and as an adult. Data are reported by two different
sets of variables. The first set presents data by the variables of probation
(regardless of whether or not they had been incarcerated), incarceration
(regardless of whether or not they had been on probation), and probation and/
or incarceration. These variables were further broken down by the categories
of none (for example, no probation), juvenile only, adult only, and both juvenile
and adult. The second set of variables presents data by probation only,
incarceration only, and both probation and incarceration.
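The juvenile/adult breakdown in the first set of variables can be sketched as a small function over the reported counts; the function name and labels are illustrative, assuming the counts of juvenile and adult sentences from the two question series.

```python
def recidivism_category(juvenile_count, adult_count):
    """Collapse juvenile/adult prior-sentence counts into the four
    reporting categories described above (for probation, incarceration,
    or probation and/or incarceration alike)."""
    if juvenile_count == 0 and adult_count == 0:
        return "none"
    if adult_count == 0:
        return "juvenile only"
    if juvenile_count == 0:
        return "adult only"
    return "both juvenile and adult"
```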
Number of Prior Sentences
The variable of number of prior sentences was derived from the same series of
questions that was used to define prior sentence. The number of prior
sentences was defined for probation, incarceration, and probation and/or
incarceration and included the categories of none, one time, two times, and
three or more times.
Reading, Writing, and Arithmetic Practices
Inmates were asked a series of questions about how often they read the
following materials in English:
letters or memos
reports, articles, magazines, or journals
manuals or reference books, including catalogs or parts lists
directions or instructions for medicines, recipes, or other products
diagrams or schematics
bills, invoices, spreadsheets, or budget tables

They were asked another series of questions about how often they wrote or
filled out letters or memos, forms, and reports and articles, and one question
about how often they used arithmetic. The frequency categories for all these
questions were every day, a few times a week, once a week, less than once a
week, and never. These categories were collapsed into three: every day or a few
times a week; once a week; and less than once a week, including never.
The household population was asked the same series of questions except
that they were asked to report how often they read, wrote, or used the
materials for their personal use and on the job. The data for personal and job
use were collapsed so as to report overall frequency for each of the materials
listed.
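The five-to-three collapse of frequency responses is a simple many-to-one mapping; a minimal sketch follows, with illustrative label strings.

```python
# Five response options collapsed into three reporting categories,
# as described above; label wording is illustrative.
COLLAPSED_FREQUENCY = {
    "every day": "every day or a few times a week",
    "a few times a week": "every day or a few times a week",
    "once a week": "once a week",
    "less than once a week": "less than once a week, including never",
    "never": "less than once a week, including never",
}

def collapse_frequency(response):
    """Map one of the five survey responses to its collapsed category."""
    return COLLAPSED_FREQUENCY[response]
```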
Types of Books Read
Respondents were asked to indicate which of the following types of books they
had read in English in the past six months:
fiction
recreation or entertainment
current affairs or history
inspiration or religion
science or social science
reference, such as encyclopedias or dictionaries
manuals for cooking, operating, repairing, or building
any other types of books
Respondents were able to indicate each type of book they had read; thus, these
categories are not mutually exclusive.
Self-Perceptions of Ability to Perform Literacy Activities
Respondents were asked how well they read and write English and how well
they do arithmetic problems when getting the numbers from materials written
in English. Four response options were given: very well, well, not well, and not
at all.
Collaboration
Respondents were asked how much help they get from family members or
friends with various types of everyday literacy tasks. Four response options
were given: a lot, some, a little, and none.

APPENDIX E
Participants in the Development Process
and Information About the Authors

Literacy Definition Committee
Ms. Barbara Clark
Regional Manager
Central Region
Los Angeles Public Library
Ms. Nancy Cobb
Manager
Human Resources Development Department
Nabisco Biscuit Company
Ms. Hanna Fingeret
Director
Literacy South
Ms. Evelyn Ganzglass
Director
Employment and Social Services Policy Studies
Center for Policy Research
National Governors’ Association
Mr. Ronald Gillum
Director
Adult Extended Learning Services
Michigan Department of Education
Mr. Karl Haigler
President
The Salem Company
Mr. Carl Kaestle
Professor of Educational Policy Studies
Wisconsin Center for Educational Research
University of Wisconsin
Mr. Reynaldo Macías
(Liaison to the Technical Review Committee)
Professor of Education and Director
UC Linguistic Minority Research Institute
University of California, Santa Barbara

Participants . . . . . . 159

Mr. David Neice
Director of Research and Analysis Directorate
Department of the Secretary of State
Canada
Honorable Carolyn Pollan
(ex-officio member)
State Representative
Arkansas State Legislature
Ms. Lynne Robinson
Director of Support Services
Division of ACE
Sweetwater Union High School District
Mr. Anthony Sarmiento
Director
Education Department
AFL-CIO
Ms. Gail Spangenberg
Vice President and Chief Operating Officer
Business Council for Effective Literacy
Technical Review Committee
Ms. Susan Embretson
Professor
Department of Psychology
University of Kansas
Mr. Jeremy Finn
Professor
Graduate School of Education
SUNY Buffalo
Mr. Robert Glaser
Director
Learning Research and Development Center
University of Pittsburgh
Mr. Ronald Hambleton
Professor
School of Education
Laboratory of Psychometric and Evaluative Research
University of Massachusetts
Mr. Huynh Huynh
Professor
Department of Educational Psychology
University of South Carolina at Columbia
Ms. Sylvia Johnson
Professor
Howard University

Mr. Frank Schmidt
Professor
Industrial Relations and Human Resources
College of Business
University of Iowa
Mr. Richard Venezky
(Liaison to the Literacy Definition Committee)
Professor
Department of Educational Studies
University of Delaware
Literacy of Incarcerated Adults Review Group
Ms. Caroline Wolf Harlow
Statistician
Bureau of Justice Statistics
Mr. Christopher Koch
Education Program Specialist
Office of Correctional Education
U.S. Department of Education
Ms. Harriet Lebowitz
Social Science Research Analyst
Federal Bureau of Prisons
Mr. Ronald Pugsley
Office of Vocational and Adult Education
U.S. Department of Education
Ms. Gail Schwartz
Chief for the Office of Correctional Education
U.S. Department of Education
Literacy of Older Adults Review Group
Ms. Michele Adler
Disability Policy Analyst
Office of Assistant Secretary for Planning and Evaluation
Department of Health and Human Services
Ms. Helen Brown
(Liaison to the Literacy Definition Committee
and the Technical Review Committee)
Research Analyst/Associate
American Association of Retired Persons
Ms. Bella Jacobs
Consultant
National Council on the Aging
Mr. Robert H. Prisuta
Senior Research Associate
Research and Data Resources Department
American Association of Retired Persons

Test Development Consultants
Ms. Valerie de Bellis
Center for Mathematics, Science, and Computer
Education
Rutgers University
Mr. John Dawkins
Language and Literature Department
Bucks County Community College
Ms. Harriet L. Frankel
Secondary and Higher Education Programs
Educational Testing Service
Ms. Bonnie Hole
The Bureau of Evaluation and Student Assessment
Connecticut State Department of Education
Mr. Richard Lesh
Division of Cognitive and Instructional Science
Educational Testing Service
Ms. Ave M. Merritt
Secondary and Higher Education Programs
Educational Testing Service
Mr. Peter Mosenthal
Reading and Language Arts Center
Syracuse University
Ms. Pam Smith
Secondary and Higher Education Programs
Educational Testing Service
Ms. Wallie Walker-Hammond
Secondary and Higher Education Programs
Educational Testing Service

About the Authors
Karl Haigler is President of the Salem Company in Charlotte,
North Carolina.
Caroline Wolf Harlow is a Survey Statistician with the Bureau of
Justice Statistics, U.S. Department of Justice.
Patricia E. O’Connor is a Professor of English at Georgetown
University, Washington, D.C.
Anne Campbell is Director of Test Development for the Literacy
Learning and Assessment Group at Educational Testing Service.
