Paper ID: 2304.06122
Analyzing ChatGPT's Aptitude in an Introductory Computer Engineering Course
Sanjay Deshpande, Jakub Szefer
ChatGPT has recently garnered attention from the general public and academia as a tool that is able to generate plausible, human-sounding text answers to various questions. One potential use, or abuse, of ChatGPT is in answering various questions or even generating whole essays and research papers in an academic or classroom setting. While recent works have explored the use of ChatGPT in the context of humanities, business school, or medical school, this work explores how ChatGPT performs in the context of an introductory computer engineering course. This work assesses ChatGPT's aptitude in answering quiz, homework, exam, and laboratory questions in an introductory-level computer engineering course. This work finds that ChatGPT can do well on questions asking about generic concepts. However, predictably, as a text-only tool, it cannot handle questions with diagrams or figures, nor can it generate diagrams or figures. Further, the tool clearly cannot perform hands-on laboratory experiments, breadboard assembly, etc., although it can generate plausible answers to some laboratory manual questions. One of the key observations presented in this work is that ChatGPT could not be used to pass all components of the course. Nevertheless, it does well on quizzes and short-answer questions. On the other hand, its plausible, human-sounding answers could confuse students when those answers are incorrect.
Submitted: Mar 13, 2023