Writing solid and meaningful abstracts for computer science (CS) education research is a key factor in getting your research used by others. After all, publishing has an end goal: research is meant to be shared and used as a building block for future research as well as for practice.

Although the field of K-12 CS education research is still relatively new, other fields have been publishing research for many decades. These research communities have struggled with similar issues and have already adopted standards and practices for reporting on research, including guides for writing abstracts. These practices continue to evolve as the community's needs shift.

The American Psychological Association (APA) has published guides for creating abstracts across several types of study designs. In general, they recommend including objectives, participants, study method, findings and conclusions/implications (as shown in the table below).

Abstract guidelines from the American Psychological Association.

Objectives: State the problem under investigation, such as the main hypothesis.
Participants: Describe the subjects or participants, specifying their unique attributes for this study.
Study method: Describe the study method, including:

  • Research design
  • Sample size
  • Materials used (e.g., tools, software)
  • Outcome measures
  • Data-gathering procedures

Findings: Report findings, including effect sizes and confidence intervals.
Conclusions and/or implications: State conclusions and implications.


This is a great summary of what to include in an abstract, but practically, how can you do this?

Let’s talk about the exceptions first. Many research papers are unique. Maybe it’s a position paper and the authors are building a case for why their position is valid. Or maybe the paper didn’t involve an intervention or any data analysis. Why does this uniqueness matter? Each type of paper may require some nuance and a little more thought about how to construct the abstract so that readers are not left wondering what the paper entails.

Outside of those exceptions, the majority of CS education papers report on studies that include an intervention or evaluation of some sort. For those, we recommend using a structured abstract. Structured abstracts follow a standard, predetermined format. Some structured abstract templates cover exactly what the APA Journal Article Reporting Standards (JARS) recommend reporting.

An example of a structured abstract is A Pair of ACES: An Analysis of Isomorphic Questions on an Elementary Computing Assessment by Parker, Garcia, Kao, Franklin, Krause, and Warschauer (2022):

Background and Context. With increasing efforts to bring computing education opportunities into elementary schools, there is a growing need for assessments, with arguments for validity, to support research evaluation at these grade levels. After successfully piloting a 10-question computational thinking assessment (Assessment of Computing for Elementary Students – ACES) for 4th graders in Spring 2020, we used our analyses of item difficulty and discrimination to iterate on the assessment.

Objectives. To increase the number of potential items for ACES, we created isomorphic versions of existing questions. The nature of the changes varied from incidental changes that we did not believe would impact student performance to more radical changes that seemed likely to influence question difficulty. We sought to understand the impact of these changes on student performance.

Method. Using these isomorphic questions, we created two versions of our assessment and piloted them in Spring 2021 with 235 upper-elementary (4th grade) students. We analyzed the reliability of the assessments using Cronbach’s alpha. We used Chi-squared tests to analyze questions that were identical across the two assessments to form a baseline of comparison and then ran Chi-Squared and Kruskal-Wallis H tests to analyze the differences between the isomorphic copies of the questions.

Findings. Both assessment versions demonstrated good reliability, with identical Cronbach’s alphas of 0.868. We found statistically similar performance on the identical questions between our two groups of students, allowing us to compare their performance on the isomorphic questions. Students performed differently on the isomorphic questions, indicating the changes to the questions had a differential impact on student performance.

Implications. This paper builds on existing work by presenting methods for creating isomorphic questions. We provide valuable lessons learned, both on those methods and on the impact of specific types of changes on student performance.

An example of a structured abstract that slightly modifies the structure is Practitioner Perspectives on COVID-19’s Impact on Computer Science Education Among High Schools Serving Students from Lower and Higher Income Families by McGill, Snow, Vaval, DeLyser, Wortel-London, and Thompson (2022). In this article, the authors chose to use a structured abstract but included more of their findings, so space prohibited them from adding implications to the abstract.

Research Problem. Computer science (CS) education researchers conducting studies that target high school students have likely seen their studies impacted by COVID-19. Interpreting research findings impacted by COVID-19 presents unique challenges that will require a deeper understanding as to how the pandemic has affected underserved and underrepresented students studying or unable to study computing.

Research Question. Our research question for this study was: In what ways has the high school computer science educational ecosystem for students been impacted by COVID-19, particularly when comparing schools based on relative socioeconomic status of a majority of students?

Methodology. We used an exploratory sequential mixed methods study to understand the types of impacts high school CS educators have seen in their practice over the past year using the CAPE theoretical disaggregation framework to measure schools’ Capacity to offer CS, student Access to CS education, student Participation in CS, and Experiences of students taking CS.

Data Collection Procedure. We developed an instrument to collect qualitative data from open-ended questions, then collected data from CS high school educators (n = 21) and coded them across CAPE. We used the codes to create a quantitative instrument. We collected data from a wider set of CS high school educators (n = 185), analyzed the data, and considered how these findings shape research conducted over the last year.

Findings. Overall, practitioner perspectives revealed that capacity for CS Funding, Policy & Curriculum in both types of schools grew during the pandemic, while the capacity to offer physical and human resources decreased. While access to extracurricular activities decreased, there was still a significant increase in the number of CS courses offered. Fewer girls took CS courses and attendance decreased. Student learning and engagement in CS courses were significantly impacted, while other noncognitive factors like interest in CS and relevance of technology saw increases. Practitioner perspectives also indicated that schools serving students from lower-income families had 1) a greater decrease in the number of students who received information about CS/CTE pathways; 2) a greater decrease in the number of girls enrolled in CS classes; 3) a greater decrease in the number of students receiving college credit for dual-credit CS courses; 4) a greater decrease in student attendance; and 5) a greater decrease in the number of students interested in taking additional CS courses. On the flip-side, schools serving students from higher income families had significantly higher increases in the number of students interested in taking additional CS courses.

So far, we’ve talked about how meaningful abstracts can be structured. We now turn our attention to a simplified process for writing an abstract that uses the following steps:

Tips for Writing a Meaningful Education Research Abstract

  1. Write your paper first.
  2. Set up the structure for your abstract. We recommend choosing an abstract template similar to the above examples.
  3. For the research problem, identify one or two key sentences in your introduction section that highlight the research problem. Copy and paste those sentences into the Research Problem section of your abstract. Don’t worry about length or readability right now.
  4. For the research question, identify your question(s) and add them to the Research Question(s) section.
  5. For each of the other sections, look in the corresponding sections of your paper and identify the key sentence(s) from each. Once you find them, copy and paste those sentences into the appropriate area of the abstract. Don’t forget to include the population studied and the number of participants where appropriate.
  6. Review what you’ve written and start editing your abstract, modifying it to fit the word count requirements of the venue where it will be submitted.
  7. Check your final abstract against the APA (or other) abstract guidelines to make sure it still meets best practices.

Sometimes you need to write an abstract first and submit it on its own. If the abstract is accepted, then you have a green light to write your paper. In these cases, starting with an outline for a structured abstract can guide its development and ensure it has sufficient detail for reviewers to accept it.