
January 29, 2008

Thinking About the New Literacy

In the report Learning in the 21st Century, writer Alvin Toffler is quoted as saying, "The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn." So what will literacy look like as we go further into the 21st century? What will it mean to learn, unlearn, and relearn?

I think a recent article in Edutopia magazine is a start to addressing these questions. The article, Programming: The New Literacy, compares the idea of programming to that of scribing. Author Marc Prensky reminds us that at one time written language was a skill reserved for only a few. When the need arose to communicate with written language, you needed a scribe to create the message and another scribe on the other end to decipher it. How will programming be handled in the next century? Will we choose to simply pay someone else to do the job for us?

In a small way, programming is already in our daily lives. As Prensky explains, even VCRs need to be programmed. Do we do it ourselves, or do we have a child do it for us? Children seem to have a natural curiosity about and aptitude for these tasks. They are the digital generation. Like Toffler, Prensky notes the importance of learning one programming language and then moving on to another, higher-level language to accomplish tasks. This sounds like an example of learning, unlearning, and relearning.

Tell us what you think. What will literacy in the 21st century look like? How do you think Computer Science and programming fit into this idea of literacy?

Dave Burkhart
CSTA K-8 Representative

Posted by cstephenson at 02:07 PM | Comments (12)

January 24, 2008

Clarifying the Dewar and Schonberg Article

There has been quite a bit of discussion about the article by Dewar and Schonberg

http://www.stsc.hill.af.mil/CrossTalk/2008/01/0801DewarSchonberg.html

in which they claim:

"It is our view that Computer Science (CS) education is neglecting basic skills, in particular in the areas of programming and formal methods. We consider that the general adoption of Java as a first programming language is in part responsible for this decline."

In http://itmanagement.earthweb.com/career/article.php/3722876 Dewar clarifies that it isn't Java itself that he blames so much as the fact that "use of Java's graphical libraries lets students cobble together code without understanding the underlying source code."

The only evidence offered for these claims is that they see a decrease in performance in their systems and architecture classes. They also report trouble recruiting qualified applicants who have the right foundational skills for their Ada programming company, which develops mission-critical software.

The biggest flaw in the article is the lack of evidence supporting the claims. How many people fail the systems and architecture classes now compared to when C++ or C was the introductory language? Is this a problem just at their schools, or nationwide? If the introductory courses switched to Java and the follow-on courses never changed to introduce the concepts no longer covered in the introductory course (like pointers), then of course more people will fail. It is likely that using C or C++ in the intro course just caused more people to fail and quit after the first course instead of later. Perhaps the systems and architecture courses are being taught poorly. At Georgia Tech, we found that student performance improved in low-level systems courses when we used the context of programming for a Game Boy. Students today don't find low-level systems programming as interesting as students did 20 years ago, when computers weren't capable of much.

I am not surprised that they have trouble finding people who know Ada; its popularity certainly peaked many years ago. I also don't find it compelling that they want people to have more low-level skills, since the biggest growth is in jobs that require higher-level skills (like software engineering).

One of the reasons Java is a popular language in industry is that you don't have to build everything from scratch. Good software engineers need to know how to reuse existing classes and how to design classes that can be reused. Why should students have to build their own graphics primitives instead of using the Java graphics classes? What learning do they really miss out on by not doing this?
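As a minimal sketch of what that reuse looks like (the class name LineDemo is just illustrative), drawing a line with Java's standard graphics classes takes a single library call:

    import java.awt.Graphics;
    import javax.swing.JFrame;
    import javax.swing.JPanel;

    // Drawing a line by reusing Java's built-in graphics classes,
    // rather than implementing the pixel math ourselves.
    public class LineDemo extends JPanel {
        @Override
        protected void paintComponent(Graphics g) {
            super.paintComponent(g);
            g.drawLine(10, 10, 200, 120); // the library rasterizes the line for us
        }

        public static void main(String[] args) {
            JFrame frame = new JFrame("Reusing the graphics classes");
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.add(new LineDemo());
            frame.setSize(240, 180);
            frame.setVisible(true);
        }
    }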

When I first took a 3D graphics course, we had to develop the algorithm for drawing a line. As students, we found this a boring and tedious task, since even at that time all the graphics packages had algorithms for drawing a line. I very much doubt that this is still required in current 3D graphics courses. Yet the field of 3D graphics has made huge advances since then. In part, we made those advances by not reinventing everything.
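For anyone who never had to do it, here is a rough sketch of the kind of integer line-drawing routine we had to work out by hand (this is the well-known Bresenham algorithm; the plot method is just a stand-in for writing a pixel):

    // A sketch of Bresenham's integer line algorithm, the sort of routine
    // graphics students once derived by hand instead of calling drawLine.
    public class Bresenham {
        static void plot(int x, int y) {
            System.out.println("(" + x + ", " + y + ")"); // stand-in for a pixel write
        }

        static void drawLine(int x0, int y0, int x1, int y1) {
            int dx = Math.abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
            int dy = -Math.abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
            int err = dx + dy; // error term tracks distance from the true line
            while (true) {
                plot(x0, y0);
                if (x0 == x1 && y0 == y1) break;
                int e2 = 2 * err;
                if (e2 >= dy) { err += dy; x0 += sx; } // step in x
                if (e2 <= dx) { err += dx; y0 += sy; } // step in y
            }
        }

        public static void main(String[] args) {
            drawLine(0, 0, 6, 4); // prints the pixels the line would touch
        }
    }

Getting the error bookkeeping right is exactly the kind of tedious detail that the library call above hides.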

Dewar in particular claims that the introductory curriculum has been "dumbed down" to make it more fun and appealing. Again, what evidence does he give for this claim? He says that students are not learning formal methods for proving program correctness, but my understanding is that this field, which was popular in the 1980s, has not had much practical success. He also claims that students don't have enough knowledge of floating-point computation. Again, what proof does he give of the need for this? Students certainly need to be aware of the pitfalls of floating-point computation, but very few will go on to do mission-critical low-level work.
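Since floating point came up: as a quick, standard illustration (not taken from the article) of the kind of pitfall students should recognize:

    // A classic round-off example: 0.1 and 0.2 have no exact binary representation.
    public class FloatDemo {
        public static void main(String[] args) {
            double sum = 0.1 + 0.2;
            System.out.println(sum);        // prints 0.30000000000000004
            System.out.println(sum == 0.3); // prints false
            // Compare with a tolerance instead of ==.
            System.out.println(Math.abs(sum - 0.3) < 1e-9); // prints true
        }
    }

Knowing to avoid == on doubles is the level of awareness most programmers need; deriving error bounds for numerical software is a far more specialized skill.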

Our research on learning computing in a context, whether it be Media Computation, MATLAB, or robotics, has shown that context does improve student success and retention. We have the evidence to back this up, not just at Georgia Tech but at several other universities and colleges. Just because you make something fun or interesting doesn't mean you have "dumbed it down" or that students aren't learning what they need in order to be successful in a career in computing.

Barb Ericson
CSTA Teacher Education Representative

Posted by cstephenson at 03:06 PM | Comments (0)