Cross-posted from Making Sense of Science.
A question that has been on my mind recently is whether Computer Science at tertiary level adequately prepares students for work in the real world. And, secondly, whether that is really the point of Computer Science at all.
Today, I was in a tutorial for one of my CompSci courses, and while waiting for the tutors to arrive I was chatting away with the student next to me. A few minutes into our conversation, I asked him, “So what do you think you want to do when you graduate?” In hindsight, I was probably putting him on the spot a bit—after all, most of us only have a faint idea about what we really want to do.
He replied, “I’m not sure,” and then added, “The stuff we learn is only really basic.” And he’s right. The content taught in Computer Science, at least up to Stage II, where I am now, is often challenging in the sense that there are new concepts to grapple with and material to commit to memory. But even so, it is not nearly enough for students to begin building even basic software applications.
I accept that some concepts and terminology, although abstract and not often used outside of an academic setting, simply need to be learnt. And I don’t for a minute think that universities should stop teaching courses about algorithms, data structures, graph theory, and other fundamentals. But at the other end of the spectrum, where theory is put into practice, I wonder whether the teaching could be improved.
The way I see it, much of Computer Science simply does not lend itself well to the “lectures, assignments, mid-term, exam” format that most university courses tend to follow.
Other than the minority who continue in Computer Science as academics, most students will go on to be software developers (sometimes called software engineers). In such occupations, the actual work will probably involve the entire development process, from planning, design, and construction to the ongoing maintenance of software. Most students, as our lecturers often point out, will never have to implement the algorithms we’re taught in class. They will just import a library, or modify someone else’s implementation.
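To make that concrete, here is a toy sketch in Python (the scenario and the `user_ids` data are my own invention): instead of implementing binary search from scratch, as we do in class, a working developer would typically just reach for the standard library’s `bisect` module.

```python
import bisect

# A sorted list of user IDs, as might come back from a database query.
user_ids = [3, 17, 42, 56, 99]

# In a course assignment we might write binary search by hand;
# on the job, we would almost always use the library version instead.
index = bisect.bisect_left(user_ids, 42)
if index < len(user_ids) and user_ids[index] == 42:
    print(f"User 42 found at position {index}")
```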
In my experience, Computer Science does a good job of covering the fundamentals, the theory behind the code, which is critical if students are to solve problems effectively and efficiently. But what is missing is more of the practical side: for example, how to use version control systems, how to design a software application from a given specification, and, most importantly, how to work collaboratively with other developers.
I don’t believe that universities need to radically alter their Computer Science programmes, or introduce a raft of new courses, to make improvements in these areas. Even within existing courses, I think students could benefit from assignments and examples that better reflect reality, even if only a very simplified version of it.
Rather than being asked to manipulate text strings simply to demonstrate a particular concept, I believe students would benefit more, both in their understanding of the concept and in their long-term success as developers, if such problems were placed in some real-world context.
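As a sketch of the kind of reframing I mean (the log format and fields here are invented for illustration), the same string-manipulation concepts could be taught through a task like pulling fields out of a web server log line, rather than through arbitrary strings:

```python
# The same string-handling concepts, framed as a real-world task:
# extracting fields from a (made-up) web server log line.
log_line = '127.0.0.1 - - [10/Oct/2023:13:55:36] "GET /index.html HTTP/1.1" 200'

ip_address = log_line.split()[0]                     # first whitespace-separated field
status_code = int(log_line.rsplit(maxsplit=1)[-1])   # last field, the HTTP status

print(f"Request from {ip_address} returned status {status_code}")
```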
In addition to providing more context, I also see a place for more project-based courses that put into practice the skills students will need as developers, such as an understanding of usability, testing, collaboration, and software design. For example, students might be tasked with developing a working piece of software, whether it be a machine learning programme, a smartphone app, or some form of augmented reality widget, using the tools and techniques of real-world developers.
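On the testing side, for instance, a project course could require students to submit unit tests alongside their code. A minimal sketch using Python’s built-in `unittest` module, with a made-up `apply_discount` function standing in for real project code:

```python
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100)

class TestApplyDiscount(unittest.TestCase):
    def test_basic_discount(self):
        # 25% off 100.0 should give 75.0.
        self.assertAlmostEqual(apply_discount(100.0, 25), 75.0)

    def test_invalid_percent_rejected(self):
        # Discounts over 100% should be rejected outright.
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    unittest.main()
```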
As you can probably tell, these thoughts are somewhat unstructured as they stand. You could argue that there are already many opportunities, such as hackathons and student internship programmes, that give students a taste of real-world software development. But they are often run by private companies looking to scoop up graduates, and may give a skewed picture of the industry.
However, I’m prepared to be corrected. So what do you think? If you’re a student or someone involved in the field, do you agree that Computer Science as a subject should have more of a practical focus? Or do you believe it’s perfectly fine the way it is?