I was pointed to a blog post titled
Radical Change in Higher Education: Money, Sports and Computers. The author expresses distrust of Massive Open Online Courses (MOOCs) like Udacity, then goes on to present her ideas for deeper, more radical change.
There are things I kinda agree with but don't foresee happening, such as ending college sports as they exist now and creating minor leagues in their stead. I never really thought it made sense for institutes built upon intellect to associate so tightly with teams built on physical prowess. The minor leagues of baseball exist because baseball is a summer sport and college students are generally off over the summer, so collegiate teams cannot attract the same audiences. Summer break exists because summer was when people were needed back home, and with so few people now involved in agriculture, that's no longer a design requirement. The fundamental change I would make, trimesters, would tend to make collegiate baseball more viable and minor league baseball less viable. Ah well.
Unless I misunderstood something, I believe that my school's sports program is self-supporting. If a university's program is not self-supporting, or if there isn't something else the school gets from it that justifies it (and for the life of me, I can't think of anything, but I've never been a team sports guy), I think the school would be better off getting rid of it.
I see the point of "replacing" collegiate sports with fitness and wellness programs, but honestly,
every personal improvement in wellness I have ever experienced has come from working with myself, not with a group. I think the group dynamic messes it up, but that might just be me.
The last point, the one I'll quote, is one I both strongly agree and strongly disagree with.
Computer Science: CS should be required. For everyone. Can you be a historian today without using a computer? An artist? A salesperson? Anything? Shouldn’t we aspire to turn out a new generation of educated men and women who have more than a surface knowledge of how the blasted things work, since their success in no small part will depend on that knowledge?
I hold a CS degree. I work with computers by day and play with them at night. My day job involves the use of computers in science, and I've been saying this for years:
Today, all science is computer science, because the research that could've been done without computers has already been done without them. I think the same is becoming true in other fields. Between
Processing,
Tod Machover's work and work with genetic algorithms in composition, there's precedent for the use of computational tools in the arts. I think you can still be a historian without using a computer for much more than email and word processing, but I've heard of historians making more interesting use of it. First, there's the wider dissemination of contemporaneous source material, but beyond that, many are beginning to see digitized libraries as a Big Data source, where you can graph the rise and fall of ideas and people by the number of times they occur in the text.
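To make that concrete, here's a minimal sketch of the kind of counting I mean. The corpus layout (plain-text files named by year, like "1862.txt") and the search term are my own invention for illustration, and a real project would want to normalize by corpus size, but the core of the idea is just this:

import re
from pathlib import Path

def term_counts_by_year(corpus_dir, term):
    """Count occurrences of a term in each year's file of a hypothetical corpus."""
    counts = {}
    pattern = re.compile(re.escape(term), re.IGNORECASE)
    for path in sorted(Path(corpus_dir).glob("*.txt")):
        year = path.stem                      # e.g. "1862" from "1862.txt"
        text = path.read_text(errors="ignore")
        counts[year] = len(pattern.findall(text))
    return counts

if __name__ == "__main__":
    # Hypothetical corpus directory and term; swap in your own.
    for year, n in term_counts_by_year("corpus", "telegraph").items():
        print(year, n)

Plot those counts over time and you have the rise and fall of an idea, at least as crudely as word frequency can capture it.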
I'd throw in the idea that this goes down to the level of skilled labor.
Adam Davidson writes in the New York Times Magazine about the "Skills Gap", saying that machinist training starts with using tools to cut metal but quickly moves on to computer-aided methods.
So, yes, I agree wholeheartedly that a great many fields have much to gain from embracing the computer revolution. I'm all for teaching programming to all. I'm just not sure that Computer Science is really where you want that.
Computer Science is different. Computer Science, properly considered, is a branch of mathematics built around complexity. Yes, students go into Computer Science as a step toward becoming programmers, but this means there's a great deal of knowledge they gain and never use, and a great deal of knowledge they don't gain until they get on the job and find they need it. I still feel undertrained in the use of version control, for example. Those students would be better served by a program built around Software Engineering. That term is problematic too: there is no mechanism for becoming a licensed software engineer, while licenses
are required to call yourself an engineer in other fields, and many of the greatest advances in computing have come from people with the most tenuous claim to that title. Linus Torvalds was a college student when he started developing Linux.
Consider the case of databases. There is one set of skills that users (be they artists, historians, scientists, programmers, machinists...) use to collect what they need, another set that programmers use to effectively create the tables and data structures to be used by others, and another set that programmers use to create the database engines themselves. The first set of skills is something I would wholeheartedly endorse encouraging everyone to know. The second set is a little less useful unless you're stepping up to start collecting your own data; I'm learning it a little at a time, and finding holes in my knowledge every time I create a set of tables. The third set, the skills that those at Oracle or on Microsoft's SQL Server team or involved in PostgreSQL develop, are ... I don't know enough about them to really describe them. But they're more about making these things fast at the hardware level, so that several thousand transactions a second can go through under load.
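To put a rough boundary between the first two skill sets, here's a minimal sketch using Python's built-in sqlite3 module. The table, columns, and data are invented for illustration: the schema at the top is the second, more specialized skill, and the query at the bottom is the kind of thing I'd love everyone to be able to write.

import sqlite3

conn = sqlite3.connect(":memory:")

# Skill set two: designing the tables and data structures others will use.
# (Hypothetical schema, invented for this example.)
conn.execute("""
    CREATE TABLE letters (
        id       INTEGER PRIMARY KEY,
        author   TEXT NOT NULL,
        year     INTEGER NOT NULL,
        mentions TEXT
    )
""")
conn.executemany(
    "INSERT INTO letters (author, year, mentions) VALUES (?, ?, ?)",
    [("A. Historian", 1862, "telegraph"),
     ("B. Historian", 1871, "telegraph"),
     ("C. Historian", 1871, "railroad")],
)

# Skill set one: asking the data a question.
for year, count in conn.execute(
    "SELECT year, COUNT(*) FROM letters WHERE mentions = ? GROUP BY year",
    ("telegraph",),
):
    print(year, count)

The third set of skills, the engine-building set, doesn't show up in a sketch like this at all; it's everything that makes that query come back quickly when the table has a billion rows instead of three.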
Thing is, while the last category is closest, none of it is really Computer Science. I think forcing this association between computer use and Computer Science doesn't help any party involved.