Archive: 2023

Emerging Promising Practices for CS Integration


Our recently accepted paper, Emerging Practices for Integrating Computer Science into Existing K-5 Subjects in the United States, will be presented at WIPSCE 2023 in Cambridge, England. 

This qualitative study, conducted by Monica McGill, Laycee Thigpen, and Alaina Mabie of CSEdResearch.org, included interviews with researchers and curriculum designers (n=9) who have engaged deeply in K-5 CS integration for several years. Their perspectives were analyzed and synthesized to inform our results.

Several promising practices emerged for designing curriculum, creating assessments, and preparing teachers to teach CS in an integrated manner. These include giving teachers ways to vary instruction, integrating CS into core (and often-tested) language arts and mathematics, and simplifying assessments. Many of the findings arise from the need to help teachers become comfortable teaching a new subject woven into their existing subjects.

Generally, the promising practices that emerged included adopting Universal Design for Learning practices, giving teachers ways to adapt the curriculum and vary instruction to fit their comfort levels as they learn to teach integrated CS, and co-designing lessons with teachers. The experts also suggested capitalizing on integration into language arts, since it is a highly tested and critical subject for learning.

Figure 1. General findings.

Among the more specific findings, the experts suggested focusing on fractions in math, leveraging cause and effect in science to teach conditional logic, and reflecting on how language is used similarly in English and in computing.

Subject Integration Findings across ELA, Math, Science, and Social Studies:

  • ELA: use games and other tools, and reflect on how language is used both in English and in computing.
  • Math: emphasize computational thinking, use virtual manipulatives, focus on fractions, and enhance learning with other tools.
  • Science: leverage cause and effect to teach conditional logic.
  • Social studies: incorporate cultural holidays into CS.

Figure 2. Subject specific findings.

You can read the full paper (including our methodology and profiles of our experts) here.

Monica M. McGill, Laycee Thigpen, and Alaina Mabie. 2023. Emerging Practices for Integrating Computer Science into Existing K-5 Subjects in the United States. In The 18th WiPSCE Conference on Primary and Secondary Computing Education Research (WiPSCE ’23), September 27–29, 2023, Cambridge, United Kingdom. ACM, New York, NY, USA, 10 pages. https://doi.org/10.1145/3605468.3609759 (effective after September 27th, 2023).

This material is based upon work supported by Code.org. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of Code.org.

We acknowledge and thank Brenda Huerta for her assistance with the literature review.

Conducting High-quality Education Research in Computing that is Designed to Support CS for All


Join us on Wednesday, March 20th, 2024, from 1-5pm PST, in Portland, Oregon, United States (the day before the ACM SIGCSE Technical Symposium) for a workshop on conducting high-quality, equity-enabling education research in computing!

Register using this form.

This event will be for computer science education researchers who want to learn more about:

  • Characteristics of high-quality education research,
  • How to conduct research that meets these characteristics, and
  • How to center the participants and their lived experiences throughout the research process.

The workshop will be held Wednesday, March 20th, 1-5pm PST at ACM SIGCSE Technical Symposium as an affiliated event*.

Participants will learn about the guidelines and associated resources, discuss their application to current or proposed research projects, and gain a new appreciation for how to embed equity perspectives in each phase of their research. Specifically, participants will develop personal positionality statements and improve their ability to write research questions, use theoretical frameworks, and develop instruments and protocols.

This interactive workshop is geared toward those studying computing education who want to learn more. We welcome applications from those with any level of education research experience. For this particular event, graduate students will be prioritized for the limited spaces available.

This presentation is supported by a National Science Foundation grant. U.S. citizens, nationals, and permanent residents will receive a $150 stipend for participating.

Facilitators for this event will include:

  • Monica McGill, CSEdResearch.org
  • Sarah Heckman, North Carolina State University
  • Leigh Ann DeLyser, CSforALL
  • Jennifer Rosato, National Center for Computer Science Education
  • Isabella Gransbury, North Carolina State University

This workshop is based in part on guidelines for conducting education research that were created during a 2023 ITiCSE workshop (McGill, M. M., Heckman, S., Chytas, C., Diaz, L., Liut, M., Kazakova, V., Sanusi, I. T., Shah, S. M., & Szabo, C. Conducting High-Quality Equity-Enabling Computing Education Research. Working Group Report, accepted with revisions).

*This event is an in-person event only. We are aware that attending in-person is not feasible for all researchers. Therefore, future hybrid events and additional resources are being planned to meet the needs of all researchers and will be shared as they become available.

For questions about this event, please email [email protected].

Register using this form.

“But They Just Aren’t Interested in Computer Science” (Part One)


Written By: Julie Smith

Note: this post is part of a series about the most-cited research studies related to K12 computer science education.

When discussions about the lack of women in tech occur, it is sometimes claimed that the disparities exist because girls just aren’t as interested in studying computer science in school and women simply choose not to work in the tech industry.

This sentiment is horribly misleading. It is true that research shows that girls and women are, on average, not as interested in studying or working in computing. But what is important to understand is that interest isn’t like eye color: it’s not an inherent, biological attribute that simply reflects human diversity. Rather, what we choose to be interested in is strongly influenced by what our culture conveys is appropriate for ‘people like us.’ This may seem to be an unusual way of thinking about the issue; we often assume that our interests are simply pure reflections of our personality and volition. But, at least for the case of interest in computer science, the evidence suggests otherwise.

In “Computing Whether She Belongs: Stereotypes Undermine Girls’ Interest and Sense of Belonging in Computer Science,” Allison Master, Sapna Cheryan, and Andrew N. Meltzoff describe two experiments showing how (dis)interest in computer science can be influenced by very simple interventions. They created photographs of stereotypical (think: Star Trek posters) and non-stereotypical computer science classrooms, showed them to high school students, and asked which classroom they would prefer. Girls were significantly more likely to express interest in the course in the non-stereotypical classroom. (Boys’ interest was not affected.) In their second experiment, the researchers provided participants with written descriptions of a computer science classroom, some stereotypical and some not. Again, girls were much more interested in a course in the non-stereotypical classroom.

These two experiments are important because they show that interest in computer science isn’t hard-wired. Rather, it appears to be strongly influenced by whether computing is presented as conforming to stereotypes that are less welcoming to girls. For those of us concerned with the negative effects of the lack of women in tech – not just on the women themselves but on a society increasingly shaped by technology – these results are good news: they show that relatively simple interventions can increase girls’ interest in the study of computing.


Shout out to our interns


Last week was National Intern Day, which gave me another reason to reflect on the many students I’ve worked with over the last seven years who have contributed to the K-12 CS Education Research Resource Center on our site.

The Resource Center has been under development since 2017. Originally funded by a National Science Foundation grant and now by Amazon Future Engineer, the project owes much to the many interns who have worked on it over that time, whether they supported our mission for one semester or for three years. I’m also thrilled to say that I was fortunate enough to publish several articles with one-third of our interns in an effort to engage them in computing education research. (Why, yes, that was my attempt to bring them to the dark side!)

It’s really incredible and I am personally grateful for their contributions and camaraderie. I’m also thankful that so many have stayed in touch with me after graduating and starting their post-college careers.

So, HUGE SHOUT OUT to all of you! And many, many thanks from me and a grateful research community that still uses your contributions today.

Monica McGill, President & CEO, CSEdResearch.org

 

Media

  • Emily Nelson, Undergraduate Student, Bradley University (current)

Data Curation (2017-2023)

  • Alia Saadi El Hassani, Undergraduate Student, Knox College
  • Alaina Mabie, Undergraduate Student, Bradley University
  • Arsalan Bin Najeeb, Undergraduate Student, Knox College
  • Ava Lu, Undergraduate Student, Knox College
  • Bishakha Awale, Undergraduate Student, Knox College
  • Bishakha Upadhyaya, Undergraduate Student, Knox College
  • Brenda Huerta, Undergraduate Student, Bradley University
  • Emily Schroeder, Undergraduate Student, Knox College
  • Jessica Potter, Undergraduate Student, Bradley University
  • Joey Reyes, Undergraduate Student, Knox College
  • Ma’Kiah Holliday, Undergraduate Student, Rochester Institute of Technology
  • Olivia Lu, Undergraduate Student, Bradley University
  • Sarah Wallenfelsz, Undergraduate Student, Knox College
  • Sean Mackay, Graduate Student, University at Buffalo
  • Shebaz Chowdhury⁺, Undergraduate Student, Knox College
  • Tavian James, Undergraduate Student, Knox College
  • Zachary Abbott, Undergraduate Student, Bradley University

Software Development (2017-2023)

  • Bishakha Upadhyaya, Undergraduate Student, Knox College
  • Hung Vu, Undergraduate Student, Knox College
  • Momin Zahid, Undergraduate Student, Knox College
  • Nate Blair, Graduate Student, Rochester Institute of Technology
  • Nhan Thai, Undergraduate Student, Knox College
  • Thu Nguyen, Undergraduate Student, Knox College
  • Trang Tran, Undergraduate Student, Knox College

 

⁺ Deceased

Learning Isn’t Observable. So How Do We Measure It?


Written By: Julie Smith

Note: This post is the second in an occasional series about learning analytics, based on the Handbook of Learning Analytics.

We measure learning constantly – think of grades on spelling quizzes, SAT scores, and the bar exam. But it’s worth remembering that learning itself cannot be observed in the same way that the growth of a plant can: when it comes to learning, we have to make decisions about how to measure something we can’t see. Gray & Bergner (2022) outline the choices that must be made when educational researchers operationalize a learning construct – such as a sense of self-efficacy, the ability to work productively in a group, or subject matter knowledge.

First, the decisions: Gray & Bergner present a very helpful distinction between measurements designed to understand a construct and those designed to improve it. The two aren’t the same: we might understand something (for example, that students who spend more time on discussion boards earn higher grades) without being able to improve it (if we direct students to spend more time on discussion boards, they may spend less time reviewing for an exam and earn a lower grade).

Next, Gray & Bergner review the strengths and weaknesses of three kinds of data that can be used for educational measurement. Validated and reliable surveys exist, and they are easy to administer at scale. But they may suffer from various biases related to self-reporting. Trace data (such as keystroke data from students using an educational technology learning platform) can be gathered unobtrusively, but it can be difficult to draw conclusions from it. Text data can be a rich source of insight into a student’s thought processes, but training either machine learning models or humans to assess and code such data is tricky. 

Other pitfalls exist as well. One challenge with many forms of data is that they don’t capture change over time. The authors point out that previous research shows that “cycles between positive and negative emotions can have a positive impact on the learning process compared to maintaining a consistent emotion” (p. 24), which is precisely the kind of insight that can be lost in a data snapshot. Similarly, information can be lost when data is cleaned; for example, grouping students by their final letter grade overemphasizes the difference between a student who earned a high B and one who earned a low A, while underemphasizing the difference between a student who failed with a 0 and one who failed with a 69.

Despite these challenges, Gray & Bergner aren’t discouraged about the potential for learning analytics to help understand and improve learning outcomes – in fact, their careful outline of the challenges facing various forms of data collection is a good step toward thoughtful, responsible data collection and use.


 

Learning Analytics: What Is It?


Written by: Julie Smith

Note: This post is the first in an occasional series about learning analytics, based on the Handbook of Learning Analytics.

Defining a slippery concept is sometimes compared to trying to nail Jell-O to a wall. That analogy certainly applies to learning analytics – there’s no shortage of definitions of ‘learning’ or of ways to measure and analyze it. So the response to the question ‘What Is Learning Analytics?’ (Lang et al., 2022) is a welcome framing of a complex topic; the authors present learning analytics through four different lenses:

First, learning analytics is a concern or a problem. That is, modern educational methods generate big data which needs to be analyzed. Not only does that require technical skills grounded in a sound approach, but it also raises issues related to privacy, ethics, and equity. The authors point out a “tension between learning as a creative and social endeavor and analytics as a reductionist process that is removed from human relationships” (p9).

Second, learning analytics is an opportunity. The data generated by learning management systems (such as Canvas and Blackboard) makes it possible to gain insight into the process of learning, not just its product.

Next, learning analytics has become a field of inquiry. What distinguishes it from other uses of data to improve education? The authors point to the idea of a ‘human in the loop’ as central to the field. That is, the goal is not to replace instructors or curriculum designers but rather to provide information that augments their decision-making. As a field, learning analytics has grown rapidly since its inception a little over a decade ago.

And, finally, it is a community. Centered on the Society for Learning Analytics Research, academics, researchers, educators, practitioners, and industry representatives have formed a community of practice.

This framework for understanding learning analytics through four different lenses provides a balanced approach to the promise and peril of using big data in educational contexts. Future posts will explore various methods and applications of learning analytics.

 


Join us at AERA 2023!


Attending the AERA Conference in Chicago this month? Join us as we present at the American Educational Research Association (AERA) 2023 conference on April 14th and 15th.

Our panel, “Co-constructing Systemic Support for Sustaining Humanizing and Inclusive Computer Science Teacher Education”, will be presented on Friday, April 14th at 9:45 AM (1st Floor of the Swissotel Chicago, Montreux 3). Our co-panelists are Michelle Friend (University of Nebraska Omaha), Maya Israel (University of Florida), Janice Mak (Arizona State University), Amy Ko (University of Washington), and Monica McGill (CSEdResearch.org).

The goal of this presentation is to shed light on some of the equity-based issues relevant to the computer science education community and to offer solutions that educators and practitioners can implement. Panelists will discuss equitable practices in education research and teacher preparation programs. There will also be plenty of time for questions from the audience.

We hope to see you there!

Best Paper Award to Joey Reyes, Undergraduate Intern


We’re pleased to share that Joey Reyes and Monica McGill (advisor) recently received the Best Paper Award at the 2023 ASEE IL-IN Section Conference, held at Southern Illinois University Edwardsville on April 1, 2023.

Joey, an undergraduate student at Knox College, presented their paper Feasibility of Using the CAPE Framework to Identify Gaps in Equity-focused CS Education Research. The paper describes a pilot test Joey and Monica conducted to determine the feasibility of using the CAPE theoretical framework to identify coverage of equity-focused CS education research (CSER). The Capacity, Access, Participation, and Experience (CAPE) framework, developed by Fletcher and Warner, examines the capacity to offer CS education, learner access to CS education, learner participation in CS education (enrollment), and the experiences learners have when learning CS.

Joey Reyes presents the paper at the ASEE Illinois-Indiana regional conference.

They started with one primary research question: How feasible is it to use the CAPE framework to identify coverage gaps in K-12 CS education research?

Then they created a secondary research question to narrow down the set of articles examined and to test the framework’s feasibility: What are the gaps in research focused on K-12 CS education in which girls are participants in the studies?

They used the Resource Center’s set of 800+ articles, selected the studies in which only girls were participants (n=51), and then reviewed each of the 51 articles to determine which key CAPE component(s) it covered. The pilot results showed that CSER among girls covers areas related to Experience (92%) and Capacity (59%), but offers little to no coverage of Access (0%) and Participation (2%). Within Experience, coverage of some areas was much higher than others, indicating potential gaps in research.

Experience results showing gaps in research for girls (such as Persistence and Self-regulation)

To answer the primary research question and determine the feasibility of using CAPE for analyzing the entire corpus of 800+ articles, they evaluated feasibility across two key areas, implementation and practicality, and found both to be satisfactory.

Attending SIGCSE Technical Symposium 2023? We’ll be there!


We’ll be actively engaged in the SIGCSE Technical Symposium (TS) in Toronto.

If you’d like to learn more about the work we have recently been engaged in, be sure to stop by any of our sessions. Only our workshop requires registration.

 

Day/Time | Type | Room | Title

  • Wednesday, March 15, 7-10pm | Workshop | 713 | Creating and Modifying Existing Surveys to Fit Your CS Education Research Needs (In-Person)
    Ryan Torbey (AIR), Monica McGill (CSEdResearch.org), Lisa Garbrecht (University of Texas at Austin)
  • Thursday, March 16, 11:35am | Paper Presentation | 715 | Growing an Inclusive Community of K-12 CS Education Researchers (In-Person)
    Sloan Davis (Google), Monica McGill (CSEdResearch.org)
  • Friday, March 17, 11:10am | Paper Presentation | 701B | Building upon the CAPE Framework for Broader Understanding of Capacity in K-12 CS Education (In-Person)
    Monica McGill (CSEdResearch.org), Angelica Thompson (CSEdResearch.org), Isabella Gransbury (North Carolina State University), Sarah Heckman (North Carolina State University), Jennifer Rosato (College of St. Scholastica), Leigh Ann DeLyser (CSforALL)
  • Friday, March 17, 3:45pm | Panel | 718A | Building Capacity Among Black Computer Science Educators (Hybrid)
    Angelica Thompson (CSEdResearch.org), Allen Antoine (The University of Texas at Austin), Anita Debarlaben (University of Chicago Laboratory Schools), Donald Saint-Germain (University Heights Secondary School), Leon Tynes (Xavier College Preparatory), Vanessa Jones (Computer Science Teachers Association)
  • Friday, March 17, 4:10pm | Paper Presentation | 701B | Measuring Teacher Growth Based on the CSTA K-12 Standards for CS Teachers (In-Person)
    Monica McGill (CSEdResearch.org), Amanda Bell (CSTA), Jake Baskin (Computer Science Teachers Association), Anni Reinking (CSEdResearch.org), Monica Sweet (University of California San Diego CREATE)