On “Learn to Code”
Should industry be allowed to dictate our school curriculums?
The “learn to code” movement has gained a lot of traction. Where children once said they wanted to be a popstar or a footballer, many now say programmer. It is not yet up there with lawyer or doctor, but organisations like code.org are trying to change that, and large corporations are pouring significant amounts of money into lobbying, into outreach groups like Google CS First, CoderDojo, Black Girls Code and Girls Develop It, and into companies like Codecademy and Khan Academy.
It’s not hard to understand why companies like Facebook and Google want everyone to learn to code. When everyone can code, programmers’ wages can be kept in check. Silicon Valley firms often complain of a talent shortage, and it is far cheaper to have the government train people for free than to invest in their own staff training schemes and apprenticeships.
In the United Kingdom, “computing” was added to the national curriculum in September 2014, after Ian Livingstone wrote a report arguing not only that the games industry is the fastest-growing industry in the UK, but also that it is suffering a severe talent shortage. “We felt that the education system was not meeting the needs of our industries”, he wrote in the introduction. The report talks about the importance of mobile phone games and 3D blockbusters, and quite early on identifies the problem as one of the pipeline:
Yet, the sad truth is that we are already starting to lose our cutting edge: in just two years, it seems the UK’s video games industry has dipped from third to sixth place in the global development rankings. Meanwhile, the visual effects industry, though still enjoying very rapid growth, is having to source talent from overseas because of skills shortages at home. That is mainly a failing of our education system – from schools to universities – and it needs to be tackled urgently if we are to remain globally competitive.
The tech industry is known for being homogeneous: mostly white and male. Focusing on the pipeline is the easy way out: just increase the number of people in the pool rather than examining hiring bias, or addressing the culture that makes so many women and minorities leave their tech jobs. Instead of making tech culture fit more people, the “learn to code” movement seems to be about delivering more “culture fit” people.
Instead of talking about learning to code, we should be talking about coding to learn
The learn-to-code movement is also somewhat tangled up with the maker movement, and very focused on producing things. The US government, of course, wants Americans to start making things again. Making things on a computer is a cheap way of making, and so far the learn-to-code movement has mostly focused on producing games and ecards. At Code Club, a volunteer-run after-school club started in the UK, the first two terms are about making games, but only in a shallow way: “write this code to make this effect”. There is nothing about what makes games fun, why people play games at all, or the psychology and behavioural science behind them. In term two you at least get to design your own game rather than follow instructions, but again, you’re just making drawings. It teaches nothing about playtesting or improving your game.
If we focus solely on making things, we are missing out. Imagine if English class were just about producing short stories. You would not read other stories, just write your own. You would not analyse other stories or even critique your own. You would never learn about composition, just churn out more stories. Ian Livingstone would probably approve of the Code Club curriculum: writing code from instructions to produce unexamined games would surely equip these kids with the right skills for game factories. Won’t somebody think of the economy!
But I think selling computing on its direct vocational relevance to the software industry is a mistake. Only a small proportion of students will end up there. The skills learnt in computing are applicable to work across many industries, but less so if we let the software industry dictate the curriculum.
The greatest tragedy of introducing computing into schools was the separate computer lab, where kids were taken away from their normal classroom for special lessons. I strongly feel that computing should be integrated into all subjects. Mathematics could be learnt far better through programming; primary sources in history could be more fun when you examine the raw data yourself, scraping huge amounts of it and asking the right questions to make sense of it; physics could mean building your own worlds to simulate its laws. Making computer art should not be separate from art class. The computer is the greatest tool we have, and instead of learning to code we should be focusing on coding to learn.
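To make the “coding to learn” idea concrete, here is a minimal sketch of what a physics lesson might look like. It is only an illustration, not from any actual curriculum, and the function name and numbers are invented: a pupil simulates a dropped object step by step, then checks the result against the textbook formula t = √(2h/g).

```python
def simulate_fall(height_m, dt=0.01, g=9.81):
    """Step a dropped object forward in time until it reaches the ground.
    Returns the simulated fall time in seconds."""
    t, y, v = 0.0, height_m, 0.0
    while y > 0:
        v += g * dt   # gravity speeds the object up a little
        y -= v * dt   # the object falls a little further
        t += dt
    return t

# Pupils can compare the simulation with the analytic answer t = sqrt(2h / g)
print(round(simulate_fall(20.0), 2))       # simulated fall from 20 m
print(round((2 * 20.0 / 9.81) ** 0.5, 2))  # textbook formula: 2.02
```

The point is not the code itself but the questions it invites: what happens if you change the time step, the height, or gravity itself? That experimentation is the lesson.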
I am concerned that teaching computing as a separate subject misses the point of what computers could do in education. To quote Mitch Resnick, creator of Scratch:
When we learn to read, we also acquire a new way to learn. After learning to read you can use your new skill to acquire knowledge about various other subjects by reading books. It’s the same with learning to code. When you learn to code you can use your new skills and code to learn.
Should Industry be Allowed to Dictate Our School Curriculums?
We should also be examining how technology is changing society. Not just how to program, but how it is used for various purposes. There is a world of difference between what computers can do and what society chooses to do with them.
In August 2011, Eric Schmidt, then executive chairman of Google, accused the UK of throwing away its computing heritage, finding it concerning that the country that invented programming and the electronic computer did not teach computer science in school. The UK did, however, have an ICT (Information and Communications Technology) curriculum, which could easily have been written by Microsoft: kids learned to use Word, Excel and PowerPoint. These are useful office skills, but not enough. The Education Secretary, Michael Gove, called the curriculum dull, demotivating and harmful, and decided to start over. Computing was introduced in schools as a compulsory subject in September 2014, following countries like Estonia and Finland.
But should industry be allowed to dictate our school curriculums? Was it not industry that got us into this mess in the first place? By promising schools free computers fully equipped with Word, Excel and PowerPoint, Microsoft lobbied to have its software taught everywhere. And when Gove called the old curriculum demotivating, harmful and dull, he was referring to learning those very office skills. Most criticism of the ICT curriculum was that it taught kids to be secretaries. But getting industry to write the new curriculum is somehow now cool?
I recently took an online course intended to prepare ICT teachers for the new computing curriculum. A history-of-computing video tells us about “BT character” Tommy Flowers, how BT (British Telecom) shaped the country, how BT pioneered fibre optics… Unsurprisingly, the course turned out to be sponsored by BT. Undoubtedly we can learn a lot from BT, but a course designed by BT is unlikely to tell us the whole story. Likewise, a computing course sponsored by Google is unlikely to tackle topics like corporate mass surveillance, even though that is Google’s business model. We want to equip people not only with coding skills but also with the ability to analyse and criticise how technology is shaping society. If industry sets the curriculum, will there be room for criticism? [Full disclosure: I had to resign as director of Code Club, a company I founded, for publicly criticising one sponsor’s involvement in corporate mass surveillance and another sponsor’s production of military weapons.]
I want the conversation to change: to be about the importance of open data, holding governments accountable, privacy and surveillance, and freedom of speech and expression. We should be talking about how to solve civic problems using the tools we have available. Sometimes that will involve a computer. But the important bit is not the computer; it is asking the right questions and finding the right problems to solve.