14 How to Prepare Written Reports

Key takeaways for this chapter…

  • There are many potential readers of school psychologists’ reports, each of whom should be considered during report preparation
  • Because there are many types of referral questions, a single approach to written reports is impossible
  • Written reports, regardless of format, should explicitly answer the referral question(s)
  • Written communication is best when it reduces unnecessary jargon, technical detail, and length
  • Routine use of checklists concerning written communication can promote skill development

Cases/vignettes in this chapter include…

  • Ozel and Stacy, ED or no ED
  • Malik, ADHD
  • Reynaldo and Pabla, two summaries
  • Dimitri, the student or the test
  • Spencer, the limitations of “Teach & Tell”
  • Jenny, author’s motives are unneeded
  • London, be careful about making attributions
  • Leigh Anne, people first wording

 

For many school psychologists, each professional day involves at least some time spent preparing written reports. Their written work, which in school is often quasi-legal, is apt to be scrutinized by administrators, colleagues, and parents. But reports are also clinical. Parents and students trust that a mental health professional will reveal the truth and offer wise, compassionate guidance in a document that can be easily read. And, truth be told, reports can live into the future for years (or even decades). It is unknown when a professional’s report might resurface, surrounded by pleasant or unpleasant circumstances. No wonder written reports create so much anxiety and consume so much time. For these reasons, it is worth taking a deep dive into written report preparation. Let’s start with something simple, a single report format.

A Generic Format for School-Based Reports

As you have already learned, there is no single type of social-emotional evaluation. Thus, it’s no surprise that there is not just one format for organizing a written social-emotional report. Table 14.1 provides a single example. This format, or variants of it, can be used for many referral questions. It is also a good configuration for students practicing report preparation. In some school settings, however, it may be judged unsuitable for questions pertaining to special education or 504 eligibility. This is true when school districts expect their psychologists to document satisfaction of various legal obligations related to special education law.

Table 14.1 One Option for a Generic Psychological Report

Section | Content | Commentary
Identifying Information | Demographics, such as name, birthdate, gender, ethnicity | This is simple factual information.
Reason for Referral | One or more specific questions that need to be answered | Often, this is kept brief so that the purpose of assessment remains salient (see Chapter 3).
Background Information | Anecdotal information, narrative, prior comments; background (and current) information from classroom teachers; cumulative school records; records of special services (accessed via the confidential file); parent-completed history and development forms; interview with parents, face-to-face or by phone; outside professional reports; permanent products | This can be extensive and might include subsections to help the reader follow along (see Chapter 4).
Assessment Procedures | Examples: Conners Comprehensive Behavior Rating Scales; diagnostic interview; teacher interview; student interview; etc. | This is merely a bullet-point-style listing of tests and other procedures.
Findings | This is an extensive section where interpretations are recounted stage by stage. Scores are often left out of this section and moved to an appendix. | Following the HR logic (Chapter 2) is one strategy for organizing this section. Many of the recommended practices found in this chapter concern this section of a report.
Summary | Concise summarization in one or two paragraphs. | This is the view from 30,000 feet, not down in the weeds. This brief section may be the only portion that some users will read.
Diagnostic Impression | List any IDEA categories found; list any DSM-5 diagnoses (optional, depending on setting) | Many schools confine statements to one or more of the special education categories or simply state “no disability.”
Recommendations | These should follow logically from what was found. | Often, these are brief and help direct teachers and parents to recognized resources, extant programs, or broadly conceived interventions. In general, a school psychologist’s report is not the proper document for telling teachers how to teach their student or parents how to parent their child (i.e., extremely detailed suggestions probably don’t belong in a report).
Signature | Name, degree, credential | Add a physical signature, if possible.

It’s important to remember, however, that special education eligibility decisions arise from a team’s assessment and a team’s judgment (not just an isolated psychologist). What’s more, these multidisciplinary team (MDT) evaluations almost always have a district-specific template. In other words, the school psychologist working with her team follows this extant MDT template. In addition, there is no universal template and thus no way to prescribe exactly what a beginning school psychology student should expect to encounter in their home school district. That said, consider as an example the headings found in one school district’s MDT (legalistic) evaluation:

  • identifying information
  • type of evaluation
  • eligibility text
  • vision results
  • hearing results
  • team participants
  • referred by
  • reason for referral
  • review of existing data
  • efforts to educate the student in the general classroom
  • attendance and educational history
  • summary of previous assessment
  • background medical and developmental information
  • current classroom-based assessment and performance in general curriculum
  • impact of educational disadvantage and limited English proficiency
  • consideration of need for additional data
  • evaluation procedures
  • classroom observations
  • test behavior
  • assessment results
  • other findings
  • eligibility
  • summary of assessment results in evidence of disability
  • effect of disability and educational needs to access the general curriculum
  • accommodations
  • assistive technology
  • additions or modifications to services needed for progress in the general curriculum
  • determination of eligibility

Just what type of material fits under each of these general headings is hard to pre-specify. Rather, school psychologists practicing in this school district would learn over time what type of information goes where. Likewise, a school psychologist employed in this district (or any other one) would also develop a sense of scope: how much detail is required and how many facts are to be included. No matter the number of headings and how much material is needed under each, the list above is an intimidating one. Although other school districts may use different headings, or fewer headings, compiling an eligibility-oriented report can prove exhausting.

The listed headings connote a certain sense: the terms seem special education-related. The entire document would almost certainly convey a quasi-legal orientation. This is true because such headings are in place to assure preparation of a report that fulfills the LEA’s legal obligations under IDEA. Incidentally, on this same theme, school psychologists can now find books compatible with this orientation (i.e., Writing Useful, Accessible and Legally Defensible Psychoeducational Reports; Hass & Carriere, 2014; emphasis added). School administrators, who are often directors of special education, are typically concerned about paperwork audits. They wish to assure that any state-level review of their district’s special education documentation is successful. Administrators recognize that audits often consist of clerks armed with checklists who confirm, student by student, that critical eligibility-related elements are documented. For these reasons, eligibility-related team reports risk becoming rigid, technical, hard to read, and taxing to prepare. Thus, it is hardly a surprise that parents who attend eligibility team meetings become frustrated when the uniqueness of their child and that child’s needs becomes lost in a process that can seem only marginally humane. What can be done to improve all types of reports, including those that concern special education eligibility? A number of considerations are listed below. Readers will decide which to incorporate in their practices and which to ignore. Most (but arguably not the first two) are optional.

Some Elements that Create a Strong Report

No one has all the answers about what makes a quality report. The following paragraphs, however, seem to represent elements that are worth considering. You might check some of your work against each.

Referral Questions Asked and Answered

As you learned in Chapter 3, referral questions are essential. They establish the boundaries of your responsibility. Questions start the assessment process, and the assessment process logically remains incomplete until questions are answered. Thus, school psychologists should always review their reports to assure that all referral questions have been addressed explicitly. But there’s more to it than that. The referral question sets the tone of the report and helps specify the target audience for whom it is written. As seen in Table 3.1, there is an array of referral questions that have nothing to do with special education eligibility. In light of this fact, the long and cumbersome report typically prepared to answer special education questions is sometimes unneeded. In other words, the length and contents of reports should be driven by the referral question. For example, a circumscribed referral question about the prospect of social skills problems sufficiently severe to warrant enrollment in a social skills program (see the case of Angela in Chapter 1) would call for a report different from one addressing questions of special education eligibility. In the first case, detailed documentation of adverse educational impact is unneeded. The report may be very, very brief. A list of recommendations would not be required (a social skills program is already on the horizon pending confirmation of skill deficits). In the second case (the one concerning special education), however, attendance patterns, classroom productivity, current academic skill levels, as well as changes to the trajectory of academic skill acquisition are mandatory. So too in the second case is the need to nail down the nature, severity, and chronicity of an emotional problem. A written report that must accomplish all of these things cannot possibly be extremely brief. Similarly, if a referral concerns the prospect of self-harm, then information bearing directly on that question can indicate which headings are needed (if any) and the type of information that ought to appear under each heading. In situations where risk of suicide is at issue, otherwise highly relevant background information (e.g., developmental history, permanent products, record of special services) may take a backseat to direct inquiries with the student, parents, and teachers. The resulting information might be organized around crucial topics like signs of depression, mood instability, prior suicidal comments, and prior suicidal actions, not scores on all-purpose rating scales. Under circumstances like this, ad hoc headings are sometimes advised as the only means of producing a coherent and uncluttered document. Even though a generic-style report (e.g., Table 14.1) may prove to be a useful default option, it should never straitjacket the writing of thoughtful school psychologists.

All Elements Confirmed in Eligibility Cases

You saw in Chapter 2 that checklists can help tame faulty, Automatic System-linked judgments. They help slow thinking and assure that any relevant objective criteria are considered one by one. Chapter 10 provides a checklist for ED determination. School psychologists (and their teams) are urged to use such a checklist for each prospective ED determination. You are also urged to address in writing, somewhere in your report, your analysis of each ED (or autism, or OHI, or Section 504) criterion. Your words regarding eligibility might appear in a report summary or, alternatively, in a section that specifically concerns special education eligibility. Or perhaps a serial recital of each criterion, and what you concluded regarding each, could appear elsewhere in the report. Consider the following two examples:

  • “In summary, as the information above indicates, Ozel appears to satisfy criteria for the special education category of emotional disturbance. Specifically, each of the following necessary elements is judged to be present: ‘an inability to build or maintain satisfactory interpersonal relationships with peers and teachers.’ This has been true of him over a long period of time, to a marked degree, and in a way that has adversely affected his educational performance. In addition, he is in need of special education and related services.”
  • “Taken together, assessment findings indicate the following. Stacy does not satisfy criteria for the special education category of emotional disturbance. Although she now seems to exhibit marked problems with “inappropriate types of behavior and feelings under normal circumstances” (as required by the emotional disturbance definition), this has not been true of her over a long period of time (as also required by the emotional disturbance definition). Moreover, the critical criterion of adverse educational impact is not met (i.e., academic grades, skill development, classroom productivity, and attendance are all currently adequate).”
Figure 14.1 Checklists, even informal ones, can help avert mistakes. Photo by Glenn Carsten-Peters on Unsplash.

Recommendations that Spring from Understanding and Match Conclusions

Psychology’s longstanding research goals are to understand, predict, and control (improve) human behavior (Allport, 1940). In some ways, psychology’s most basic practice goals are identical. Thus, it makes sense that written recommendations link these three fundamental aspirations. Simply put, assessment can create understanding. Understanding in turn can help predict whether current behavior is likely to persist unchanged in intensity and severity (i.e., existing problems will presumably continue, or worsen, absent intervention) or self-correct as time passes. The former case logically calls for recommendations; the latter does not. Understanding also fuels intelligent efforts at improving (controlling) behavior. For example, once it is recognized that a boy in middle school has ADHD, his history of lost assignments, inadequate planning, and chaotic time management makes more sense (i.e., he and his actions are better “understood”). By the same token, research suggests that most school problems associated with ADHD do not spontaneously self-correct (the future of his problems can be at least partially “predicted”). Thus, standard interventions for ADHD-associated academic and classroom productivity problems can logically be recommended (i.e., steps to “control/improve” are green-lighted). Recommendations to alleviate frustration and boost self-confidence (required for some students with ADHD, but not all) might also be needed. You might have noticed that the first set of recommendations springs from a nomothetic orientation. In this sense, the student is first understood to be an outlier regarding attention and to be suffering as a consequence; off-the-shelf treatments documented to work by group-level research are thus indicated. In contrast, the second set of recommendations (dealing with frustration and self-confidence) is idiographic in orientation. The student’s unique reaction includes his own mixture of frustration and poor self-conceptualization. The best (and most complete) plan for him arguably arises from understanding garnered by simultaneous focus on nomothetic and idiographic orientations. For this very reason the HR Worksheet, as a tool for envisioning recommendations, prompts school psychologists to consider both orientations as they reach their conclusions.

This suggests that assessment-enabled understanding is a prerequisite for formulating targeted recommendations. If you fail to understand students, you cannot reasonably program for them. But understanding alone is also arguably insufficient. Consider this now quarter-century-old statement: “There is nothing inherent in the assessment process itself or psychological tests that indicates which programming recommendations should be developed. That is, tests do not make programming recommendations; psychologists do” (Wodrich, 1997, p. 335). This is a key reason that assessments are conducted by skilled professionals. Once school psychologists understand (and determine an intervention is needed), they are also prepared to suggest an array of interventions tailored to the school environment in which they work. Some of these are general (they come from group-level research), and some are specific (they are created on a case-by-case basis to respond to the uniqueness of a student and his particular needs). Consequently, formulating interventions and incorporating them in written reports is no simple undertaking. Just as one’s general assessment skills develop over the course of a career, the same is true regarding one’s facility at formulating written recommendations. You will see more about this in Chapter 15, where several psychologists learn ways to formulate strong recommendations.

Recommendations that Respect Roles

School-based practice requires special sensitivity to your role and that of others. When assessment cases come to you on a school campus, you are typically serving as a consultant. Unlike colleagues in a clinic who might assess to inform their own (and singular) treatment of a child, you are not an enduring case owner. In the days leading up to your involvement, the student had a teacher (perhaps several) and parents/guardians. In the days following the conclusion of your assessment, the same teacher(s) and parents/guardians remain in place. Your involvement, however, may be finished. For reasons like this, your role may be less central than it seems to you. Moreover, other team members possess their own expertise and their own perspectives. For these reasons and a multitude of others, you are advised to remain keenly circumspect. Know the boundaries of your role. You are not the supervisor of other case stakeholders (e.g., special education teachers, regular education teachers, counselors, parents) who might implement any of your recommendations. What’s more, in the eyes of some readers, your written recommendations may possess limited credibility. A dictatorial tone is apt to backfire. Recommendations that are too detailed and too prescriptive may insult a reader already familiar with the specifics of your suggested interventions. You are not there to tell everyone what they should do and just how they are to do it. They are professionals. Consider their needs and potential reactions to your recommendations. Table 14.2 concerns an elementary student with ADHD, Malik. In the table, you see three levels of detailedness reflected in just two recommendations that might be found in a written report. Of course, there may be five or more recommendations in Malik’s full report, but considering just these two helps you appreciate the distinctions among recommendations with little detail, those with moderate detail (placed at the end of the table to make the distinctions more salient), and those with extensive detail. After you have read them, consider the pluses and minuses of each level of detailedness. Also consider reading the recommendations found in the report of Marie Kofi’s autism case in Chapter 12. It might help you to appreciate how level of detailedness works when an entire report is available for review.

Table 14.2 Recommendations Varying in Detailedness

Recommendations: Little detail
1. OHI services

2. Behavioral strategies in the classroom

Recommendations: Extensive detail
1. Because Malik has ADHD, as described above, and because he meets the state’s criteria for Other Health Impairment (OHI) services, as described above, the MET 2 team is advised to meet and formally offer services. Because federal rules and regulations related to special education apply to students with OHI, Malik’s services should adhere to several principles. First, Malik is entitled to a Free Appropriate Public Education (FAPE). His plan must offer all he needs to assure that he is properly educated. This is true even if extensive special education and related services are required. Another key consideration for Malik’s plan is the idea of least restrictive placement. He should not be placed in a separate classroom and he probably does not require time in the resource room. Rather, classroom modifications and supports are likely to be sufficient to assure FAPE while allowing for the least restrictive placement possible.

2. A classroom configuration and instructional practices well suited to most students may fail to meet the needs of students with ADHD, including Malik. For this reason, several classroom modifications and teaching-style practices might be emphasized by Malik’s teacher. (a.) Highlight classroom rules by having them posted. Using this technique, Malik’s teacher can remind him (such as by pointing or calling out a rule’s number) when a rule has been violated or is about to be violated. In parallel, all students (including Malik) can be reinforced verbally for rule following. An example is, “I like the way everyone is now following the rules. This includes Rule #1 (inside voice), Rule #2 (only one student visits the teacher’s desk at a time), Rule #3 (start working right away and work quietly to the end), Rule #4 (compliment classmates and yourself for work well done).” (b.) Get close to each student by moving about the classroom. This practice would permit positive reinforcement when Malik (as well as classmates) is responding well. It would also allow easier observation of each student during seatwork. These helpful educational practices are often limited when a teacher remains tied to their desk. (c.) Minimize individualized verbal reprimands. To the extent possible, nonverbal cues, such as eye contact or gestures, should be used when Malik requires redirection. This might involve, for example, his teacher silently looking at Malik and raising their eyebrows or tilting their head to prompt him back to work. Similarly, nonverbal nudges to remind him of posted rules may work without calling attention to Malik, actions that might inadvertently reinforce unwanted behavior or unnecessarily embarrass him. (d.) Adopt a rapid instructional tempo. Long directions, lengthy explanations, extensive lectures, or extended intervals between instructional activities are likely to cause problems for Malik. In contrast, activities and assignments that can be started, finished (correctly), and reinforced verbally are likely to be well suited to Malik’s particular needs as a student with ADHD. An enthusiastic, upbeat voice tone and pace of speaking help capture and maintain attention. In contrast, when a teacher speaks in a monotone, too softly, or too slowly, students’ attention may not be fully captured or may drift. (e.) Assure assignments are fully understood. Students with ADHD may fail to grasp the nature and detailed requirements of assigned work. They often require special monitoring to make sure they are working correctly. To make sure that he is working well, Malik should receive extra oversight at the outset of instructional activities. (f.) Carefully manage transitions. For students with ADHD, disruptive behavior and tardy task initiation are particular risks during all transitions. To minimize these risks, it is advised to fully explain what students are to do, and precisely when they are to do it, as one activity winds down and another activity starts. Managing transitions can foster success and minimize failure.

There are also a number of strategies that might be classmate-mediated. Some recommended for Malik follow: (a.) Use peer tutoring to boost academic success. This strategy involves the entire class, not just Malik. The entire class is split into two teams. Within teams, students work in dyads as they take turns coaching one another by following an academic script. Points are assigned to provide feedback regarding correct responses, coupled with a chance for bonus points that can be awarded to pairs that work cooperatively. Points can be used to track the progress of pairs and entire teams (the winning teams receive modest praise from their competitors). (b.) Establish a peer-mediated conflict resolution program. This might help with playground behavioral excesses for Malik, but it also can aid students in general. The overall approach involves designated peers (specially appointed students with sterling academic and behavioral records) who serve as playground monitors. Monitors catch students behaving well and reinforce their actions; they may also prompt more acceptable behavior.

Malik might also benefit from implementation of self-monitoring strategies. These are as follows: (a.) Develop a program for self-reinforcement. A self-reinforcement program can be simple, such as creating clearly described tasks and specified criteria for success matching each task. Students like Malik would then earn points based on personal monitoring of their own success. Additional points might be dispensed when Malik makes accurate self-appraisals (i.e., self-ratings that match teacher ratings). (b.) Create a checklist and use it to promote self-monitoring of organization. To aid in preparedness for class and to expand the chances for instructional success, Malik could have a checklist created for him. This might include all steps necessary to receive academic instruction (e.g., notebook on desktop, proper text at hand, sharpened pencil on desk, eraser on desk, seated and attentive when teacher begins to speak, etc.). Students rate themselves by checking off items satisfied. Similar checklists can be created and applied to other troublesome tasks, such as homework preparation, homework completion, and homework’s return to school.

Recommendations: Moderate detail
1. Consider OHI services by convening a MET 2 meeting. The nature of Malik’s needs, coupled with the importance of mainstreaming, suggests most services would be best provided in his regular classroom.

2. Emphasize classroom behavioral interventions. Among these are classroom modifications and teaching-style practices including: posting classroom rules; close teacher-student proximity and copious verbal reinforcement; minimization of verbal reprimands; rapid instructional tempo; assuring that assignments are fully understood; careful management of transitions. For teachers unfamiliar with these principles, or for those seeking a refresher, a detailed handout (ADHD in the Classroom) is available in the Special Education office to aid implementation.

Peer-mediated strategies, such as the following, might also help Malik: a peer tutoring intervention for use in the classroom; a peer-mediated conflict resolution program for use on the playground. For teachers unfamiliar with these interventions, or for those seeking a refresher, a detailed handout (ADHD in the Classroom) is available in the Special Education office to aid implementation.

Consider self-monitoring strategies. Included for consideration are: a self-reinforcement program for in-class work completion and organization; a checklist for self-monitoring of organization and preparation to receive instruction. For teachers unfamiliar with these principles, or for those seeking a refresher, a detailed handout (ADHD in the Classroom) is available in the Special Education office to aid implementation.

ADHD interventions were modified from those found in Wolraich and DuPaul (2010).

A Succinct Summary

All school reports, especially those that concern special education eligibility, risk becoming very long. To help the reader, prepare a sound summary section. Parents, for example, will often have trouble following all the details and rationale that are typically embedded in a full-length report. In contrast, they may be able to understand and remember a summary. Likewise, when one school psychologist turns to a report written three years earlier by a colleague, a concise summary can be a godsend. Indeed, experts reviewing the literature (Wiener & Costaris, 2011) and those advising how to prepare reports (Braaten, 2007) highlight the importance of well-developed summaries. Consider two sets of examples below in Table 14.3.

Table 14.3 Examples of More and Less Helpful Written Report Summaries

Reynaldo

More helpful

Reynaldo is a second grader with declining academic productivity and slipping grades. He was referred for consideration of special education and related services, with particular attention to emotional disturbance. Diverse assessment information from several sources points toward internalizing problems, such as anxiety and depression. Reynaldo appears to be anxious, avoidant, and easily discouraged. These factors probably contribute to declining grades. Reynaldo has seemed especially disengaged in math, and indeed he now produces a low score on a formal measure of mathematics calculation. Consequently, Reynaldo appears to be eligible for special education designation in the category of emotional disturbance. He meets all criteria, including the characteristic of “a general pervasive mood of unhappiness or depression” over a long period of time and to a marked degree that adversely affects his educational performance. Additionally, he needs special education and related services to assure a free appropriate public education. Classroom interventions aimed to diminish avoidance and increase engagement, especially during mathematics, are advised. Also advised are remedial math instruction with a resource teacher and participation with a counselor or school psychologist to reduce symptoms of anxiety, which currently appear to be more significant than depression. Improvement on these dimensions may increase educational performance.

Less helpful

This is a summary of Reynaldo’s eligibility evaluation for possible special education services. Reynaldo is in the second grade. Reynaldo’s scores on the Behavior Assessment System for Children-3 (BASC-3) included many scores in the clinical range. All of the validity indicators on the BASC-3 were within the average range. What was found were high scores on Depression with a T-score of 81 and Anxiety with a T-score of 84 on the parent version of the BASC-3. On the teacher version of the BASC-3, Reynaldo had a Depression T-score of 84 and an Anxiety T-score of 90. These are also high. The composite score entitled Internalizing Problems on both of these BASC-3 scales was elevated for Reynaldo. Reynaldo was not administered the self-report version of the BASC-3. Instead, it was decided that a clinical interview might illuminate Reynaldo’s problems. On a clinical interview that inquired into many areas of possible problems, Reynaldo self-reported anxiety but not depression. Reynaldo seemed reluctant during this interview. Reynaldo has low grades in school and low scores on the Wechsler Individual Achievement Test-III (WIAT-III) during a recent testing session. His lowest score on the WIAT-III was in math calculation. This standard score was 78. Reynaldo would benefit from special education. Resource teachers have more time than general education teachers to address the needs of students like Reynaldo. Working with the resource teacher might help him to catch up in math.

Pabla

More helpful

Pabla’s parents sought an assessment of their healthy, normally developing, second-grade daughter to determine if ADHD was the genesis of her classroom behavior problems. They were, however, uninterested in special education or formal accommodations. Pabla expressed relatively few symptoms of ADHD based on classroom observation, teacher interview, and parent interview. Similarly, there was little to confirm the presence of ADHD on objective rating scales (i.e., Behavior Assessment System for Children-3), where dimensions related to inattention and hyperactivity were only mildly atypical. In contrast, Pabla expressed many symptoms of oppositional defiant disorder, a condition of poor cooperation, rule testing, and challenges to authority and discipline. Pabla and her parents were encouraged to participate in an out-of-school behavior management training program. Pabla’s school counselor was designated a liaison with her classroom teacher to assure that the new home-based discipline program is compatible with school discipline practices.

Less helpful

Pabla is in the second grade. She is healthy, and she has had normal development. She was not evaluated in this instance for special education. Her parents wanted to know if she might have ADHD, but they did not want to have her placed in any special education program or to get any 504 accommodations even if ADHD was present. The parents just wanted to become better informed. This evaluator administered several standardized rating scales (i.e., Behavior Assessment System for Children-3, Teacher Rating Scales [TRS] and Parent Rating Scales [PRS]). The highest score on the BASC-3 TRS was Conduct Problems with a T-score of 84. The highest score on the BASC-3 PRS was also Conduct Problems with a T-score of 80. Inattention was not in the at-risk or clinical range on the PRS. Hyperactivity was also not in the at-risk or clinical range on the PRS. Neither Hyperactivity nor Attention Problems was in the at-risk or clinical range on the PRS or TRS. It seems likely that the problem is more a behavioral one than an ADHD one. No special education services were recommended for Pabla at this time.

A good summary should be brief. Consider for context the abstract that accompanies published research articles. Even lengthy articles (e.g., 25 pages of text and tables) contain short abstracts. The cap on words is often 150, with 250 allowed for a few journals. Researchers almost always wish that they could include more in their abstracts, but the point is that to do so would defeat the goal of concise summarization. The situation with psychological reports is comparable. School psychologists would typically like to add more to their summaries, but they can learn (like their researcher counterparts) to convey essential information in just a few words. To this point, the “more helpful” summary of Reynaldo (Table 14.3) totals 200 words. The more helpful summary for Pabla (Table 14.3) is a mere 143 words. Even the “less helpful” versions for both of these students remain relatively brief. This seems to contrast with much of what is seen in the field. It’s not uncommon to encounter a 20-page psychoeducational evaluation. Worse, the 20-page report might include a summary whose own length spans one or two pages. In fact, at times it seems that the two-page summary found near the end of a 20-page report would have better stood alone as the report itself (with a little lengthening). In other words, 18 pages might have been deleted without harming a reader’s understanding.

Focus on the Student

As emphasized throughout this book, test manuals should be read before the tests themselves are dusted off and used. Rich in technical information, test manuals are nonetheless generally poor at guiding the synthesis of diverse assessment data (the kind of diverse assessment data that characterizes most of your cases). Specifically, test manuals do not tell school psychologists how to blend background information, base rates, or observations together with their scores to reach integrated conclusions. Instead, they may indicate how results from tests standing alone should be described or “interpreted.” A test-by-test interpretation, however, risks losing the child. A test-centered approach also risks incoherence as it forces all readers (especially parents) to confront jargon-filled narratives.

A better approach is to place students front and center in reports that are, after all, about them. This approach can describe seamlessly a student’s health, development, social life, academic status, social-emotional functioning and intervention needs. It arguably helps all potential readers to understand findings and grasp how recommendations might arise from these very findings. Consider the two competing styles in Table 14.4.

Table 14.4 Writing Style that Emphasizes the Student over the Test

Test-oriented paragraph

The Behavior Assessment System for Children-Third Edition (BASC-3), a well-recognized, standardized, and validated broadband rating scale, was completed by the classroom teacher of Dimitri. The BASC-3 contains several validity indicators. The F Index, L Index, and V Index scores were within the typical range. This suggests that BASC-3 scores from Dimitri’s teacher do not require modified or cautious interpretation. The BASC-3 composite score entitled the Behavioral Symptoms Index was examined next. This score was elevated into the Clinically Significant range in Dimitri’s case. A score this high suggests the presence of “pervasive and serious behavioral or emotional problems.” The next step is to interpret two composite scores (which are composite scores based on theory or factor analysis of BASC-3 items). The first composite score, Externalizing Problems, was in the Average Range. But the second composite score, Internalizing Problems, was elevated and in the Clinically Significant Range. Consequently, BASC-3 individual clinical scales, which measure “maladaptive behaviors and problematic levels of functioning,” were examined. Two of the three clinical BASC-3 scales that make up the Internalizing Problems composite were elevated into the Clinically Significant Range. These scales were Depression and Anxiety.

Student-oriented paragraph

It is noteworthy that other sources of assessment data had already raised concern about anxiety and depression. Specifically, a detailed History and Development form completed by Dimitri’s parents contained several comments suggestive of anxiety or depression. Dimitri’s teacher was also interviewed. Her answers to oral questions also suggested anxiety and/or depression. Dimitri’s chances of either anxiety or depression, already implied by parents’ comments and the teacher’s interview, were boosted by scores on objective rating scales (i.e., Behavior Assessment System for Children-3). Favorably, Dimitri’s teacher produced no evidence of invalid responding. But Dimitri’s teacher-generated profile is characterized by indications of serious emotional and behavior problems overall. Furthermore, he appears to experience internalizing problems. More narrowly, Dimitri’s internalizing problems seem to include a subjective sense of anxiety as well as symptoms of depression.

 

Little or No  “Teach & Tell” 

All school psychologists, of course, hope to convey written information clearly, truthfully, and supportively. Toward the goal of being supportive, it has become common practice to use portions of written reports as teaching tools. For example, a report might first describe the characteristics and types of scores found on an autism rating scale and then go on to report the student’s exact scores on that same scale. Why might this be done? Is this an effort to support non-expert readers, or might it reflect something less genuinely considerate?

In part, the practice of writing so as to teach the readership may have sprung from graduate training. During training students learn to write about scores on behavior rating scales, cognitive tests, and achievement measures. Sometimes (often) scores are provided when there is no real case. In other words, trainees may be given a set of scores but no referral question, no background information and no opportunity to observe or interview a student (whose existence is hypothetical rather than real). In situations like this, students-in-training are understandably left with little to interpret other than test scores themselves. Describing a test’s characteristics as a preamble to score interpretation makes a certain amount of sense under these contrived circumstances. Perhaps unfortunately, however, habits like these may carry over from graduate school to school-based practice. In fairness, some school psychologists make a rational decision to teach their readers about each assessment tool before they describe a student’s scores on that tool. The justification for this practice seems to be that once parents or teachers have general information about an assessment tool, then they will be prepared to understand a particular student’s scores.

Whatever its source, this seems to represent a two-fisted approach: first, I will teach you (generally); then, I will tell you (particularly). Simply put, this is the teach and tell strategy (T & T). Because of its widespread use, the T & T approach is worth considering. Start with a second-grader named Spencer, as found in Table 14.5. The first, italicized paragraph of the table represents the “teach” element of Spencer’s report; the second, non-italicized paragraph represents the “tell” element. For now, read both parts of this single section of Spencer’s report. Judge for yourself how well the T & T approach works in this case. Let’s make it a bit more specific. Try to determine whether this document suggests that Spencer has (or does not have) a social-emotional problem. And, if Spencer has a problem, has the T & T technique made its nature more easily grasped?

Table 14.5 “Teach and Tell”: Behavioral Rating Scales

The Behavior Assessment System for Children, Third Edition (BASC-3) is a multidimensional, multimethod set of tools that psychologists use to help determine the prospect of social-emotional problems, behavioral characteristics and self-perceptions of children and young adults. In schools, among the most-used elements of the BASC-3 system are behavioral rating scales that can be completed by parents, teachers, or the students themselves. On such rating scales, informants endorse a large number of items using a four-point Likert scale (0, 1, 2, 3). Endorsed items, in turn, allow generation of standardized scores on scales related to validity, emotional functioning, psychopathology, and adaptive skills. Standardized scores for each student depend on comparison with a large, representative national sample. For each student, T-scores are reported, which have a mean of 50 and a standard deviation of 10. Percentile ranks may also be reported. For BASC-3 clinical scales, T-scores of 70 and above are considered to be “clinically significant,” whereas T-scores between 60 and 69 are considered to be “at risk.” In contrast, for adaptive scales, T-scores of 30 and less are considered to be “clinically significant,” whereas T-scores between 31 and 40 are said to be “at risk.”

Spencer’s teacher, Deborah Alloway of Aspen Heights Elementary School, completed the BASC-3 Teacher Rating Scales (TRS). Regarding the TRS’s validity scales, Ms. Alloway’s F Index, Response Pattern Index, and Consistency Index were all in the “acceptable” range. Concerning composite scores, Spencer’s Behavioral Symptoms Index had a T-score of 59 (82nd percentile), which is in the average range. His Externalizing Problems T-score was 51 (66th percentile) and his Internalizing Problems T-score was 47 (48th percentile). Both of these composite scores also fall in the average range. Each clinical scale that contributes to the Externalizing Problems Composite was also in the average range. These are: Hyperactivity (T-score = 43; percentile rank = 29th), Aggression (T-score = 53; percentile rank = 77th), and Conduct Problems (T-score = 57; percentile rank = 79th). Likewise, each clinical scale that contributes to the Internalizing Problems Composite was also in the average range. These are: Anxiety (T-score = 50; percentile rank = 60th), Depression (T-score = 48; percentile rank = 54th), and Somatization (T-score = 44; percentile rank = 29th). However, Spencer’s School Problems composite T-score was 81 (99th percentile) and in the clinically significant range. Regarding the two clinical scales that comprise the School Problems composite score, both Attention Problems (T-score = 75; percentile rank = 99th) and Learning Problems (T-score = 82; percentile rank = 99th) were associated with scores in the clinically significant range. Turning to adaptive scales, Spencer’s composite score entitled Adaptive Skills was in the average range (T-score = 43; percentile rank = 27th). Nonetheless, two adaptive scale T-scores fell in the at-risk range. These were: Leadership (T-score = 35; percentile rank = 6th) and Study Skills (T-score = 33; percentile rank = 6th). Adaptive scales associated with average range scores were: Adaptability (T-score = 59; percentile rank = 80th), Social Skills (T-score = 48; percentile rank = 43rd), and Functional Communication (T-score = 47; percentile rank = 37th).

It can be argued that the T & T technique might end up doing more harm than good. To date, however, empirical research on this topic appears to be lacking. Several suggestions found in this chapter might make the T & T technique irrelevant (e.g., focus on the student). Incidentally, in the case of Marie Kofi found in Chapter 12 (and Appendix C), Sally Thompson demonstrates how an entire report might be written without use of the T & T technique.

Little about the Author’s Motives

Part of the process of mastering beginning report writing is convincing your supervisor (or your professor) that you know what you are doing. Consequently, school psychologists still in training often prepare their reports with their supervisor in mind. Under these circumstances, sharing the logic of your assessment plan with them might make sense. But your prime audience will become, or may already in fact be, teachers and parents. These stakeholders are probably uninterested in (or potentially even confused by) descriptions of your rationale. In general, such rationale is probably best left out. Consider the following example, a small portion of one school psychologist’s report about an elementary-age girl, Jenny.

“The first goal of the assessment plan was to understand the perspective of Jenny’s teacher. A teacher interview revealed there was concern about ‘social isolation and unusual mannerisms.’ Because this seemed like possible autism spectrum disorder, a narrow-band rating scale concerning autism was deemed appropriate. The autism scale with the best validity was selected because it could help maximize correct detections of autism and minimize false detections. When scores on this scale were in the average range, another strategy was selected. It involved a structured diagnostic interview (MINI-KID). This technique was well-suited to Jenny’s case because it possesses satisfactory evidence of reliability and validity.” This is probably too much explaining to benefit most readers, notwithstanding its ability to share with a supervisor the assessment plan of a novice school psychologist. Remember your audience (which includes parents and teachers) and write accordingly. Also, consider the “less helpful” version of a summary prepared for Reynaldo (Table 14.3), which illustrates comparable elements of justification.

Carefully Managed Test Scores

History sometimes offers perspective. When psychologists first began to work in schools many years ago, test administration and interpretation were prime goals. But the world was a lot simpler back then. Often one or two tests, like the original Stanford-Binet Intelligence Test (Terman & Merrill, 1937), were administered. Furthermore, each test typically produced just one score (e.g., an Intelligence Quotient or IQ). An IQ score, as an example, was always reported in the body of a report. As more multi-faceted tests were developed (e.g., the original Wechsler Intelligence Scale for Children; Wechsler, 1949), there was much more to report, such as a Verbal IQ, a Performance IQ, and a Full Scale IQ (plus subtest scores). In today’s world, of course, several tests are administered during almost every evaluation; even the most straightforward among these includes several sub-scales. For example, the Revised Children’s Manifest Anxiety Scale-2 produces a composite score, scores for three subtypes of anxiety, a defensiveness index, and a measure of consistency (Reynolds & Richmond, 2008). Similarly, consider the standard computer printout associated with the BASC-3 TRS or PRS like the one you saw in Chapter 5 (Reynolds & Kamphaus, 2015). There are multiple validity indexes, several composite scores, plus a number of clinical scales and adaptive scales. Furthermore, additional content scales are available, as are pages and pages concerning critical items, DSM-5 diagnostic considerations, and the like. The task of deciding what is most important in a 28-page printout associated with just one informant’s responses to just one rating scale can prove unnerving.

To be clear, there’s little doubt that vast assessment information like that found in the BASC-3 enriches today’s assessment practices. But sometimes one can become too rich. Partly for this reason, many school psychologists leave test scores out of the bodies of their reports. Instead, scores are reported only in an appendix (sometimes referred to as a “Psychometric Summary”; see the report of Marie Kofi in Appendix C). There is a rationale besides merely managing the vastness of scores vying for inclusion. A school psychologist in Arizona, Joel Hanania, asserted the following: “Test scores, explanations of test scores, and explanations of tests are not included in the body of the report, as these do not add to the reader’s understanding of the child’s problems” (2011, p. 187). Consider how much of the non-italicized information about Spencer in Table 14.5 might be shifted to an appended Psychometric Summary. Also consider the increased freedom afforded a school psychologist to describe the student, and hypotheses about him, if Dr. Hanania’s recommendation to strip the body of a report of all test scores were followed. For example, what if no test scores were included in the “less helpful” example of Reynaldo (Table 14.3)?

Miller and Watkins (2010) point out that parents’ reading of reports often leads to confusion. Consequently, these researchers investigated the effects of visual aids on parents’ understanding. They found that understanding of psychoeducational evaluations (comprising IQ and achievement scores) was enhanced by use of simple graphs (with just a few scores included). The same might be true of social-emotional scores. However, if school psychologists conduct an adequate feedback session, then parents should be less confused and less dependent on written documents to understand assessment findings. This is especially true of test scores, which are ripe for confusion. When you read Spencer’s case (Table 14.5), you might question whether parents would have been able to make sense of the substantial array of scale titles. For example, what might the BASC-3 scale titles “Atypicality” or “Somatization” connote to a mainstream parent? On the same theme, the lengthy BASC-3 TRS printout contains two clear figures (one reporting Clinical and Adaptive T-Score Profiles; another, Content Scales and Index T-Score Profiles). One graphic profile contains 20 dimensions and the other contains 11 dimensions. Not only are T-scores reported but so are percentile ranks. Might all this fancy graphic material associated with all of these psychological terms actually help parents to understand? Perhaps. Perhaps not. Again, we seem to lack empirical research to help guide our practices. Especially for those who think not, a reasonable approach to satisfying the ethical requirement to explain findings might be the following:

  • Always have a feedback session with parents
  • Leave parents with contact information so that follow-up oral explanations can occur, as needed
  • Add a statement to all written reports that scores should not be interpreted by parents (parents should defer to the report itself; see sample wording in the Psychometric Summary of Marie Kofi in Chapter 12).

No Ambiguity about Outlier Scores

Psychologists become so familiar with the organization of their instruments, including the metrics associated with scores, that they risk forgetting that this knowledge is far from universal. When you write that a student had a high score on the BASC-3 Hyperactivity scale, you know implicitly that “high” denotes a potential problem. But the general readership of psychological reports would possess no a priori knowledge about whether “high” denotes a problem or a strength. Worse, on the BASC-3, high scores are sometimes indicative of problems (e.g., on content scales), whereas at other times low scores are indicative of problems (e.g., on adaptive scales). Thus, it is often better to select one of several problem-minimizing writing strategies that clarify the direction of outlier (i.e., statistically uncommon) scores.

The first is to skip references to high and low (or elevated and depressed) scores in favor of other, non-directional terms. For example, replace “He had a higher score on the BASC-3 Attention Problems scale than on any other scale” with “He had a worse [or more problematic or more unfavorable or more concerning or more worrisome] score on the BASC-3 Attention Problems scale than on any other scale.”

A second method is to anchor descriptions with parenthetic elaborations. This practice sometimes removes ambiguity. For example: “Willow had an elevated score (where high scores indicate problems) on the BASC-3 Internalizing Problems scale but low scores on the Adaptive Skills composite (where low scores denote problems).”

A third method is to add evaluative terminology to help the reader understand your interpretations. For example: “Unfortunately, Edward had elevations on both scales related to inattention and hyperactivity-impulsivity. He also had troublingly low scores on measures of adaptive competence.” In this example, ambiguity about the meaning of elevation is reduced by the use of “unfortunately.” Similarly, ambiguity about the meaning of a low score concerning adaptive behavior is removed by the use of “troublingly.” The same is true regarding adding the idea of “competence” to phrasing about adaptive behavior (i.e., every reader knows that being low on any index of competence represents something negative). You might re-examine the description of Spencer’s test scores (Table 14.5) regarding this same point.
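Related to these strategies, readers less accustomed to the T-score metric may also benefit from a sense of just how uncommon an “outlier” score is. The worked conversion below is only a rough sketch that assumes an approximately normal distribution of scores (instruments such as the BASC-3 derive their published percentiles from norm tables, so exact values differ somewhat):

\[
z = \frac{T - 50}{10}, \qquad T = 70 \;\Rightarrow\; z = +2.0 \;(\approx 98\text{th percentile}), \qquad T = 30 \;\Rightarrow\; z = -2.0 \;(\approx 2\text{nd percentile})
\]

In other words, the cutoffs referenced in Table 14.5 (T-scores of 70 and above on clinical scales, 30 and below on adaptive scales) both sit roughly two standard deviations from the mean, which is why scores beyond them are treated as statistically uncommon.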

Interpretation, Not Mere Description

Especially when beginning their careers, school psychologists seem prone to merely report what they have found.  Inference-making is rare. Scores are discovered, and scores are reported. A student is observed in their classroom, and their actions are reported without refinement or analysis. This produces a style of report that is apt to be hard to read and comes across as devoid of professionalism. Consider the “less helpful” summary of Pabla, as found in Table 14.3. This school psychologist has written a summary with little interpretation. Instead, school psychologists are encouraged to interpret test results, observations, and background information, not just recite these things (Wodrich, 1997). In contrast to the “less helpful” version, Pabla’s “more helpful” summary is all about analysis, not regurgitation. This reality is evident in the following interpretation-rich phrasing found in Table 14.3, “Pabla expressed relatively few symptoms of ADHD based on classroom observation, teacher interview, and parent interview. Similarly, there was little to confirm the presence of ADHD on objective rating scales…”

Fitting shoulder to shoulder with the notion of interpretation’s primacy is the imperative to find critical information and separate it from the unimportant (Hass & Carriere, 2014). To do so helps the school psychologist produce an understandable report organized around big ideas coupled with a supporting, coherent narrative (Braaten, 2007). An approach like this suggests that each written report should provide a coherent story, beginning right from the outset of recounting background information and continuing to the very last sentence of a summary. As you learned in Chapter 2, formulating a set of hypotheses, weighing information relative to these hypotheses, generating new hypotheses, and ultimately moving toward conclusions involves thinking. It is not a matter of just collecting information. So too for written reports.

Unfortunately, it is not uncommon to read a long report that is packed with observational anecdotes, unvarnished facts about developmental history, and objective test scores but that delivers little actual analysis of the student. Facts, devoid of interpretation and inference, prove extremely difficult for the reader to follow. Readers may struggle because they have been bombarded with more facts than their working memories can hold. They sometimes scratch their heads about why they are even asked to read seemingly tangential (irrelevant) information. An unfortunate variation of this scenario is for readers to find themselves at the final page of a long report still without a good idea of what conclusions are about to be reached. In other words, even nearing the end of a report, they remain unsure of the nature of the student’s problems, whether a diagnosis is forthcoming, or even whether the student is about to be judged eligible for special education services. We should offer our readers something better than this.

Simple, Everyday Language

School psychologists sometimes fall into the use of large, sophisticated-sounding words familiar only to other psychologists. This can occur unconsciously. After all, novice school psychologists have just spent their preceding years speaking mostly with other graduate students or with professors. What’s more, in an effort to appear authoritative, school psychologists’ language can become needlessly erudite (Wiener & Costaris, 2011). This gives rise to elevated vocabulary coupled with dubious stylistic strategies that leave written reports oddly stilted (e.g., third-person references to “the examiner” when first-person references would be easier to understand; Wiener & Costaris, 2011). Jargon abounds. It’s easy to lose sight of the fact that two of the prime audiences for psychological reports (parents and teachers) don’t use this jargon. Table 14.6 provides examples of simplified wording.

Table 14.6 Examples of jargon and simplified alternatives

Jargon: During his prodromal stage, Olaf expressed both positive and negative signs and symptoms of schizophrenia.
Simplified: Olaf’s early stages of schizophrenia included hallucinations and diminished functioning.

Jargon: The latent factor structure of the instrument was compatible with fewer dimensions than the test’s authors claimed, constraining interpretation of clients’ scores.
Simplified: Because this rating scale may not measure as many traits as once believed, psychologists should stick to simple score interpretations.

Jargon: Obsessive worry about cleanliness seemed to prompt Cherise to engage in ritualistic handwashing in a sequence involving a negative reinforcement paradigm.
Simplified: Cherise seemed to worry about germs and dirt before handwashing. Each time she washed her hands her level of worry dropped. This later increased her desire to wash her hands yet again.

No Unessential Sensitive Information

School psychologists frequently encounter sensitive information, some of which is best left out of their written reports (Braaten, 2007). That said, there is often a delicate balance. On the one hand, information pertinent to understanding students or making judgments about their entitlement needs to be reported. On the other hand, some information is inconsequential. For example, a mother who shares a personal history of teenage substance abuse, an interval of homelessness, or business failures and bankruptcies should be able to expect that such material will not appear among the pages of her child’s report. But if information of this type reflects a history of impulsiveness or acting out, tendencies also evident in the current student’s case, then it may be pertinent. Discretion is what matters. A general statement that skips embarrassing details, such as “there is a positive family history for impulsive behavior,” is often sufficient.

As seen in Chapter 2, a positive family history associated with a heritable mental health diagnosis is important because such a fact changes the risk that the student has the same condition. This fact notwithstanding, for heritability risks it generally makes no difference precisely who has been diagnosed; it is the degree of shared genes (alleles) that matters. Consequently, rather than indicating that the student’s father had a history of ADHD or that the student’s mother had a history of bipolar disorder, it is sufficient simply to say that a first-degree relative was diagnosed. Similar judgment is required when reporting family financial troubles, marital stresses, atypical religious practices, and a host of other considerations too numerous to list.

Clarity of Sources

Novice school psychologists sometimes seek facts and then unquestioningly place what they have been told directly into their reports. A particular risk concerns background information. Consider the following case. London is a high school freshman referred because of “erratic behavior,” a limited ability to keep friends, and inconsistent test scores, especially in his English class. Based on a conversation with London’s mother, the school psychologist added statements to his Background Information section. Here is part of what the beginning professional wrote: “London liked school until his freshman year. At this point, his interest declined. Much of this was due to a lack of resources in his English class. Specifically, the classroom lacks proper heating. Equally troubling, an offensive odor from a nearby diesel generator sometimes confronts students when they are trying to study. The situation has been brought to the attention of London’s English teacher, but he has responded with indifference.”

Inadvertently, the school psychologist has prepared a report in which conjecture has been written as if it were documented fact. What’s more, the conjecture points an accusing finger at whoever is responsible for the maintenance and safety of the school, as well as at the professionalism of a teacher. Indeed, the building may prove unsafe and poorly maintained, and London’s English teacher may be utterly cavalier about student concerns. The key point, however, is that someone’s assertion of these things is insufficient to make them true. Don’t make the mistake of writing potentially false statements. It’s fine to report what has been learned; just attribute statements to their source. Here’s a rewrite: “According to his mother, London liked school until his freshman year. She went on to indicate that at this point, his interest declined. She also made the following troubling assertions: London’s English class is thought to lack resources. This includes the absence of proper heating together with the presence of an offensive odor coming from a nearby diesel generator. London’s mother stated that the situation has been brought to the attention of his English teacher, who was said to be indifferent.”

It should be obvious that a school administrator might react negatively to the first narrative but accept the second. The same would probably be true of London’s English teacher. Sometimes unpleasant material needs to be included in reports. But try not to let the mere presence of distasteful assertions create the appearance that you are endorsing their veracity.

“People First” Phrasing

Although labels are generally seen as a necessary part of school psychology practice, they should be used circumspectly in written reports. You might be able to recognize the objectionable wording embedded in the following description: “The Developmental and Family Questionnaire completed by Leigh Anne’s mother provided valuable information. This included two gifted cousins, as well as a diabetic brother and an alcoholic uncle.” Notice the same information conveyed via different phrasing: “The Developmental and Family Questionnaire completed by Leigh Anne’s mother provided valuable information. Leigh Anne reportedly has two cousins who have been identified as eligible for gifted designation, a brother with diabetes, and an uncle with alcoholism.” The key elements of the rephrasing concern Leigh Anne’s brother and uncle. The order has been shifted so that diabetes no longer defines her brother, nor alcoholism her uncle. A subtle shift in conceptualization may follow: problems with glucose (diabetes) are now just one aspect of her brother, and problems with alcohol are now just one aspect of her uncle. The people-first notion places the noun or pronoun first (denoting the person) and the characteristic second. Other examples of people-first wording are:

  • “The LD student” becomes “the student with LD”
  • “The schizophrenic” becomes “the teenager with schizophrenia”
  • “The dyslexic” becomes “the child with dyslexia”
  • “The autistic girl” becomes “the girl with autism”
  • “The epileptic” becomes “the person with epilepsy”

People-first wording is widely endorsed (e.g., by the Publication Manual of the American Psychological Association, 6th edition; APA, 2010). Most professionals find that, with just a little practice, people-first wording becomes second nature in report preparation as well as in their speech. You are encouraged to check your own speech and writing for this way of conveying information. It’s humane and considerate.

Some Final Tips

There are many other things to keep in mind as you seek to refine your report preparation skill set. Only a few are mentioned here. You might consider working with colleagues and supervisors to develop your skills over time.

Practice Using a Checklist for Skill Development

You have heard about the value of checklists recurrently in this book. You might consider using the one provided in Figure 14.2 on your own or with a supervisor. Many of the points have been discussed above.

Figure 14.2 Checklist to improve the quality and thoroughness of written reports

Be Realistic and Consider These Two Helpful Sources

One of the legendary disconnects of graduate school is students’ self-assessment of their technical writing contrasted with their professors’ assessment. In other words, students commonly think they write pretty well, and their professors commonly disagree. Incidentally, this was just as true when your current professors were themselves graduate students as it is today. All this means that it is easy to overlook how difficult it is to write technically and, generally, how little most students have been trained to do so. Part of your journey toward becoming a competent school psychologist is to refine and continually develop your technical writing skills. Two great resources can help. The first is Chapter 3 (Writing Clearly and Concisely) and Chapter 4 (The Mechanics of Style) of the Publication Manual of the American Psychological Association (2010). The second is the classic, but tiny, volume The Elements of Style (Strunk & White, 2009).

Find Word Processing Shortcuts

Years ago, school psychologists dictated reports that were then transcribed by support staff. Few psychologists in the 1980s, for example, keystroked their own reports. The logic was that professional time was more valuable (i.e., costlier to the school district) than that of transcriptionists. Then personal computers and word processing programs appeared. Paradoxically, many school psychologists started preparing their own documents without a transcriptionist’s involvement. Many said they felt more comfortable preparing reports on their own. Indeed, over time the option for dictation and transcription has largely vanished.

Figure 14.3. Although typing your own reports may seem comfortable, the practice can devolve into time-consuming tedium. Look for shortcuts. Photo by Andrew M. on Unsplash.

If one takes a step back, however, the logic of those earlier days is probably just as compelling today. Less time spent on report preparation equates to more time for direct services, and support staff are less costly than professional staff. But since the option for old-school dictation may no longer exist, what can be done? One option is to use voice recognition software. Although apt to engender anxiety upon first use, these modern programs (which are quite user-friendly) come to feel progressively more comfortable with each use. They are far faster and much less exhausting than typing every letter of every word of every page that comprises a lengthy report. For many school psychologists, a sequence of successive approximations is the vehicle for ultimately adopting dictation. For example, a school psychologist might begin by dictating the most mundane and least variable parts of a report. Proficiency here would lead to expansion into more open-ended portions of the report, and so on. Alternatively, initial attempts at dictation might be preceded by preparation of an extremely detailed outline, which is then closely followed during dictation. Later, the outline may become more abbreviated. Still later, only a very rough outline might be prepared in advance of dictation.

For those who never move toward dictation per se, there are other possibilities. For example, many school psychologists already have developed shortcuts that depend on word processing. Standard statements have been prepared and these are cut and pasted as necessary. Others retain a list of helpful words and phrases that can be accessed as needed. Because so much time is devoted to report preparation, any procedure that speeds the process is likely to pay big dividends. You are encouraged to talk about practice efficiency with your colleagues and to share with one another strategies for efficient report preparation.

Prepare Your Report as Soon as Possible

As discussed in Chapter 2, it is often challenging to organize and make sense of diverse assessment information. Whether practitioners devise a system of their own for earmarking documents or for outlining their contents, it can be tough to keep track of things. Consequently, the more information retained in recent memory, the easier the process. Scrutinizing a Conners Comprehensive Behavior Rating Scale printout and interviews completed within the past two days is easier than scrutinizing those completed within the past two weeks, and compiling a report in the former instance is easier than in the latter. This prompted Braaten (2007) to suggest that reports be prepared as promptly as possible after information is collected.

Borrow Words from Others

Learning the proper phrases and developing a sizeable social-emotional vocabulary takes time. To speed the process and refine your ultimate skill set, borrow from others. One potential source is Braaten’s (2007) book on report writing. It offers precise words and phrases related to students’ history, behavioral observations, reactions to testing, home environments, school environments, as well as students’ reactions to various activities. For some of these topics, adjectives are provided on a continuum in Braaten’s book (almost like a pull-down menu). Another source of descriptive terms is colleagues. Realistically, the typical seasoned practitioner has developed a tailored social-emotional vocabulary honed over a period of years. When you notice a useful term or an apt phrase used by a colleague, jot it down for later use. It’s pointless to reinvent the wheel. According to the 19th-century playwright and intellectual Oscar Wilde, “Imitation is the sincerest form of flattery…”

Summary

Many considerations for effective oral communication that you learned in the last chapter also apply to report writing. And, of course, there are other considerations associated with technical writing per se. As with oral communication, report writing is most effective when it offers a coherent message centered on the student while avoiding excessive detail about tests and their scores. Checklists completed after meetings and following report preparation can help school psychologists speed the development of their communication skills.
