
Evaluating student understanding of core concepts in computer architecture

Published: 01 July 2013

Abstract

Many studies have demonstrated that students tend to learn less than instructors expect in CS1. In light of these studies, a natural question is: to what extent do these results hold for subsequent, upper-division computer science courses? In this paper we describe our work in creating high-level concept questions for an upper-division computer architecture course. The questions were designed and agreed upon by subject-matter and teaching experts to measure desired minimum proficiency of students post-course. These questions were administered to four separate computer architecture courses at two different institutions: a large public university and a small liberal arts college. Our results show that students in these courses were indeed not learning as much as the instructors expected, performing poorly overall: the per-question average was only 56%, with many questions showing no statistically significant improvement from pre-course to post-course. While these results follow the trend from CS1 courses, they are still somewhat surprising given that the courses studied were taught using research-based pedagogy that is known to be effective across the CS curriculum. We discuss implications of our findings and offer possible future directions of this work.
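The abstract's headline numbers (a 56% per-question average, and per-question tests for significant pre-to-post improvement) can be reproduced from raw response counts. A minimal sketch of that kind of analysis, assuming each question is summarized by counts of correct answers and using a two-proportion z-test; the paper's abstract does not say which statistical test the authors actually used, so the function names and the choice of test here are illustrative assumptions:

```python
import math

def per_question_average(correct_counts, n_students):
    # Mean fraction of students answering correctly, averaged over questions.
    return sum(c / n_students for c in correct_counts) / len(correct_counts)

def two_proportion_z(pre_correct, post_correct, n):
    # Two-proportion z-test comparing pre-course vs. post-course correctness
    # on a single question, with n students in each administration.
    # (Illustrative choice of test; not necessarily the paper's method.)
    p1, p2 = pre_correct / n, post_correct / n
    pooled = (pre_correct + post_correct) / (2 * n)
    se = math.sqrt(pooled * (1 - pooled) * (2 / n))
    if se == 0:
        return 0.0, 1.0
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF via math.erf.
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, pval
```

For example, with 50 students, a question answered correctly by 15 students pre-course and 30 post-course shows a significant gain, while identical pre/post counts yield z = 0 and p = 1, matching the "no statistically significant improvement" pattern the abstract describes for many questions.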




    Published In

    ITiCSE '13: Proceedings of the 18th ACM Conference on Innovation and Technology in Computer Science Education
    July 2013, 384 pages
    ISBN: 9781450320788
    DOI: 10.1145/2462476

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. assessment
    2. computer architecture
    3. curriculum

    Qualifiers

    • Research-article

    Conference

    ITiCSE '13

    Acceptance Rates

    ITiCSE '13 Paper Acceptance Rate: 51 of 161 submissions, 32%
    Overall Acceptance Rate: 552 of 1,613 submissions, 34%


    Cited By

    • (2023) Taking Stock of Concept Inventories in Computing Education: A Systematic Literature Review. Proceedings of the 2023 ACM Conference on International Computing Education Research, Vol. 1, 397-415. DOI: 10.1145/3568813.3600120
    • (2022) YODA: A Pedagogical Tool for Teaching Systems Concepts. Proceedings of the 53rd ACM Technical Symposium on Computer Science Education, Vol. 1, 613-618. DOI: 10.1145/3478431.3499322
    • (2021) Identifying Informatively Easy and Informatively Hard Concepts. ACM Transactions on Computing Education 22(1), 1-28. DOI: 10.1145/3477968
    • (2021) Student Performance on the BDSI for Basic Data Structures. ACM Transactions on Computing Education 22(1), 1-34. DOI: 10.1145/3470654
    • (2021) Proficiency in Basic Data Structures among Various Subpopulations of Students at Different Stages in a CS Program. Proceedings of the 26th ACM Conference on Innovation and Technology in Computer Science Education, Vol. 1, 429-435. DOI: 10.1145/3430665.3456337
    • (2021) A Modular Assessment for Cache Memories. Proceedings of the 52nd ACM Technical Symposium on Computer Science Education, 1089-1095. DOI: 10.1145/3408877.3432410
    • (2021) Identifying Student Misunderstandings About Singly Linked Lists in the C Programming Language. 2021 IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC), 1-9. DOI: 10.1109/VL/HCC51201.2021.9576162
    • (2020) The Practical Details of Building a CS Concept Inventory. Proceedings of the 51st ACM Technical Symposium on Computer Science Education, 372-378. DOI: 10.1145/3328778.3366903
    • (2020) Investigating the Impact of Employing Multiple Interventions in a CS1 Course. Proceedings of the 51st ACM Technical Symposium on Computer Science Education, 1082-1088. DOI: 10.1145/3328778.3366866
    • (2020) Caches as an Example of Machine-gradable Exam Questions for Complex Engineering Systems. 2020 IEEE Frontiers in Education Conference (FIE), 1-9. DOI: 10.1109/FIE44824.2020.9273822
