OK fine. It's a provocative title. But hear me out.
Most non-cave-dwelling luddites have heard about the new Apple tablet (aka the iPad). The simplest (and most misleading) way to describe it is as a gigantic iPod Touch, with all the multitouch goodness of the iPod/iPhone, as well as the numerous App Store apps.
There's a vigorous debate going on over the merits and impact of the iPad, and while it's clear that it's not a work laptop/netbook replacement, it's probably the first internet appliance with some mojo.
The word 'appliance' is chosen deliberately. The iPad essentially behaves like a completely sealed-off appliance: you can't hack it or customize it directly, and you are only allowed the interface provided to you by Apple and the App Store (also controlled by Apple). This is viewed (correctly, on many levels) as a feature, not a bug. After all, most people don't care to know how their large and complicated computers really work; all they want is a way to check email, surf the web, watch movies, and so on.
But here's the thing. For as long as the computer has been this complicated, hard-to-manage, and yet critically important device, it's been easy to make the case for computer science as an important, lucrative discipline, and one worth getting into. Even in the past few years, with enrollments plummeting (they seem to be recovering now), there's been no argument about the importance of studying computer science (even if it comes across as boring to many).
And yet, how many people enroll in 'toaster science'? More importantly, how many people are jumping at the chance to become automotive engineers? As the computer becomes more and more of an appliance that we "just use", the direct connection between the person and the underlying computing engine goes away.
Obviously there'll always be a need for computer scientists. Those social networks aren't going to data-mine themselves, and someone needs to design an auction for pricing iPad ads. But it's quite conceivable that computer science will shrink dramatically from its current size down to a much smaller discipline that generates the experts working in the back ends of the big internet companies (Microsoft, I'm not optimistic about your chances of survival).
This cuts both ways: a smaller discipline with more specialized skills means that we can teach more complex material early on, and are likely to attract only the dedicated few. However, it'll be a "few": which means that for a while, until we reach a new stable equilibrium, there'll be far fewer jobs at lower salaries.
I make this observation with no great pleasure. I'm an academic and my job might disappear within 10 years for other reasons. But along with rejoicing in the mainstreaming of what appears to be fairly slick use of technology, I'm also worried about what it means for our field as a whole.
You make a very interesting point, and it is worth really thinking about.
But I have a question about how you think a device such as the iPad will actually change people's usage of a computer. Most CS people do not use their computer just to surf the web (hopefully). Therefore, they may not be able to survive with just an iPad. Other people who are addicted to, say, Facebook may be happy with an iPad. But these people were probably using their very powerful computers just to surf the web/send email/go on MySpace anyway. So it's probably energy-efficient that they don't use such a big computer just for surfing the web.
I guess my question is that the iPad may be "simplifying" the computer into an iPad device, but most people were already not taking full advantage of their computer. How many people write programs on their computers? Most don't. So the iPad is taking the current situation and making it more efficient for most people, i.e. not wasting resources by giving people computers more powerful than what they need.
Excellent point.
Here's how I see this playing out. You're absolutely right that the iPad will merely make it obvious what people really use a computer for. CS folks will still need their high-powered computers, and the rest weren't using the power of the computer anyway.
But while everyone was working on the same device (even if they weren't using its fullest potential), the computer itself was front and center as a relevant object of study. We groan about the 'Oh, you study computer science - can you fix my computer?' jokes, but at least they allowed the general population to understand what computers were all about.
Take that away with the iPad, and now the computer goes away, and it's the internet that comes front and center. So if you're doing 'internet science' of some form or other, great. If not, watch out. Because it's these casual users who vote, write to their congresscritters, run for office, get elected, and make policy regarding matters like funding, regulation, and what not.
That's what I'm worried about.
(even if it comes across as boring to many).
If people start associating computers with cool multi-touch devices rather than with the frustration and boredom of getting printers to work, I think that would entice people to become computer scientists.
And yet, how many people enroll in 'toaster science'? More importantly, how many people are jumping at the chance to become automotive engineers?
Did mechanical and electrical engineering get studied less once their products such as toasters and cars became ubiquitous and relatively reliable?
Nice post Suresh!
One of the basic problems with computers -- and one that CS has done depressingly little to address -- is that computers suck. Of course I don't mean they suck at computing, but rather that they suck to use. However, as you have obviously noticed, appliances that happen to contain computers (including cars, dishwashers, and many other devices) need not suck.
So anyway, the appliance has to be the way forward. The present situation, where administration of a Windows/Linux machine is a near-impossible task for most users, is untenable.
First, the mathematical side of CS can persist as long as the study of math does. The iPad is still Turing-complete, and there aren't Toaster Scientists because there's no interesting "theory of toasting".
Second, applied CS is much more about software than hardware. As long as people are getting excited about new applications or games, they'll perceive the need for somebody to be developing them, whether they get them from the web or an app store.
The 'hackability' of computers drew a lot of us into this field, so we may lose numbers in future generations. But this would lead to the opposite problem of what you're predicting: too many jobs for the CS grad, not too few.
In theory, all the interesting applications might be written some day, and then we would need to shrink as a field. But the iPad seems unrelated to that question; if anything, 150k iPod apps suggest we're still finding new niches to fill. Add in robotics/AI and there are opportunities for CS to influence an ever greater number of fields.
People don't care about CS because computers are complicated; they care because computers are cool: full of innovations. (And cars generally are not.) The iPad doesn't change this, but it will help integrate computers more into our daily lives. That can only make them more interesting.
(A) The number of people doing CS is, to a large extent, a function of the marketplace. The demand for CS grads is not going to go away.
(B) I think people care about the overall impact of a field. I doubt the iPad is the end of innovation with impact on daily life.
(C) As programming enters all fields of science (as it already has), the need for CS will increase. The real question is how CS avoids the faith of Math...
(D) And who will write the applications for the iPad?
(E) Not completely relevant - there are similar products to the iPad based on Android coming out. It is quite conceivable that Apple will not be the market leader 2-3 years down the line.
----
I do think there is a problem however. As computers become slicker it is harder and harder to see the connection between CS and the products people use...
--S
Would you say CS is less interesting just because so much computing power is spent on downloading Facebook pages? If so, would you say psychology and neurology are less interesting just because so much brain power is spent reading Facebook pages? Lots of things are interesting because they have great potential, even when what they're actually most often used for is wasting time.
Yes, nice post.
But I think desktop applications have been dead for years and years.
The iPad is just the realization that most people don't need to install software applications and, indeed, shouldn't.
I sure don't want my father to go out and install software on his PC!!!
But you touch on a finer point: nobody wants to study toaster science. Damn good point.
I actually quite like the idea of people doing degrees in "Internet Science". What could be cooler than studying information in all its forms? And, given that it's tricky to get on the Internet without a computer, those internet scientists will be stuck with learning a lot of computer science too...
Seems to me you are confusing raw numbers (of people becoming computer scientists) with a shift in the field from a 'science' to a 'trade'. (To continue your analogy to automotive engineers: once automobiles became standardized, there was surely a shift from 'people who could design and build cars' [i.e., 'engineers'] to 'people who can fix cars' [i.e., 'repairmen']. But the number of auto repairmen is still huge.)
Unfortunately, this was already the way CS was heading well before the iPad, with most undergraduate CS majors not much better than glorified programmers. Or, more to the point, I don't see how the iPad itself accelerates or affects this process.
How is calling the iPad a giant iPod Touch misleading?
ReplyDeleteTo me the description "giant iPod Touch" makes it sound like you use it for the same sorts of things you'd use an iPod but it is much more comfortable to read on and much less comfortable to carry. Are you suggesting that is somehow wrong?
Oh, and a question for one of your anonymous commenters: what is the faith of math that should be avoided? (Even if you meant "fate", which I hope you didn't, I'd still ask what "fate" you meant.)
As a programmer once famously said:
"Computer science is as much about computers as astronomy is about telescopes." Slick telescopes and backyard astronomy did not kill astronomy or funding for space research, nor will the OLPC or the iPad kill CS.
CS is magic for the real world: in a world that is completely wired, CS knowledge would be like knowing magic!
However, we must ensure our ability to perform the magic by keeping standards open, unlike the Apple approach: if you own a microprocessor, you should have the full right to run code on it.
Well, I do not agree with you... I think that closed software and closed architectures are only one trend among the numerous trends we see out there.
Open source is an example. I like to describe open source as "Have it your way". While with a closed architecture you can ensure that everything the company thought of just works, you'll always find many people who don't believe the provided software fits their needs. Developing open source software will always require open source developers, and as long as open source software fills a gap in the global software industry, we will need developers. And I believe that this gap will always exist, since humankind is a perpetually unsatisfied animal and a free lunch is always welcome.
Another example would be the very promising fields of service-oriented architectures and cloud computing. Software designers will focus on their tiny area of expertise and try to get it right. People will be able to choose their service provider as they choose their phone company, and actually build their system the way that fits their needs.
It is no secret that Apple has always been a closed-system evangelist. They have been consistent all the way, except for the move to Intel-based machines, but they seem to be correcting their mistake...
Sure, there is a revolution; there are many of them, in fact. But IMHO, most of them will make computer science the central point of our modern world.
Anis
You are using "computer scientist" and "application developer" interchangeably, which of course is very erroneous. Of all computer scientists, application developers make up maybe less than 1%.