Category Archives: technology

Is It So Hard to Believe?

There are many times when I wish that I was still in the classroom. It seems like there are so many great opportunities to engage students in new ways of thinking and learning. I imagine an exciting classroom space where kids could write and create and collaborate, where my Nancie Atwell-style reading and writing workshops could move beyond the walls of the classroom, encouraging students to pursue and share their ideas and passions with the world.

OK, take a deep breath…I was just beginning to get a good rant going about this post from Richard Byrne about the new Student AR app for Google Glass. I went back to click on the press release and discovered that the whole post was an April Fool’s Day joke. Phew!

So, now what am I going to write about? How about the fact that I believed it in the first place? Throw in names like Bill Gates and Salman Khan and is it so hard to believe that they are busy creating an app that takes the teacher out of the game of assessment? It isn’t so far-fetched. The Hewlett Foundation sponsored the Automated Student Assessment Prize, designed to encourage development in the area of automated assessment, and edX has created Discern, automated scoring software. At least one researcher is busy showing that the computer can grade as well as a person, and much more quickly.

The article from the University of Akron about the work of Dr. Mark Shermis is interesting and a little ironic. Perhaps the writers should have used the software to avoid the grammatical error in this paragraph:

The study grows from a contest call the Automated Student Assessment Prize, or ASAP, which the Hewlett Foundation is sponsoring to evaluate the current state of automated testing and to encourage further developments in the field.

Did you find the mistake? “Call” should be “called.” I would also suggest that the communications and marketing department refrain from calling their website the “news” room, since this is obviously a press release. The release makes passing reference to critics of the research study but doesn’t dig too deeply into the controversial nature of automated scoring. Lucky for us, The New York Times takes news a bit more seriously and describes the real criticism of the grading software: it can be fooled. Les Perelman, the retired MIT professor who launched a petition against adopting such software, takes great pleasure in both critiquing the research AND gaming the system.

Those who criticize Perelman point out that the purpose of the software is to provide instant feedback to students so they can learn to be better writers; the final product will be read by a real person. So, what of that instant feedback? Karin Klein’s daughter found that the software was more confusing than helpful. And Barbara Chow of the Hewlett Foundation, quoted by the University of Akron, seems to undermine that very argument. For her, automated scoring will mean more writing on tests and less human grading:

“Better tests support better learning,” says Barbara Chow, education program director at the Hewlett Foundation. “This demonstration of rapid and accurate automated essay scoring will encourage states to include more writing in their state assessments. And, the more we can use essays to assess what students have learned, the greater the likelihood they’ll master important academic content, critical thinking, and effective communication.”

It turns out that fact-checking is exactly what the software doesn’t do well. It is checking for basic structure and grammar rather than knowledge or critical thinking. As an adjunct at several universities, I laughed out loud at Perelman’s argument for why higher education is so expensive:

“The average teaching assistant makes six times as much money as college presidents,” he wrote. “In addition, they often receive a plethora of extra benefits such as private jets, vacations in the south seas, starring roles in motion pictures.”

Dr. Perelman received a top score for this well-designed argument. Oh, if the computer scoring software could only make it so.
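To see why the software is so easy to fool, imagine the simplest possible version of it. This is a deliberately naive sketch, not any vendor’s actual algorithm, and the features and weights are invented for illustration. Notice what it rewards, and what it never asks:

import re

def score_essay(text):
    """Score an essay on surface features alone: length, average
    sentence length, and vocabulary variety. Nothing here checks
    whether any claim in the essay is actually true."""
    words = re.findall(r"[a-zA-Z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if not words or not sentences:
        return 0.0
    avg_sentence_length = len(words) / len(sentences)
    vocabulary_variety = len(set(words)) / len(words)
    # Hypothetical weights; real systems fit these to human-scored essays.
    score = (
        4 * min(len(words) / 300, 1.0)            # longer looks "developed"
        + 3 * min(avg_sentence_length / 20, 1.0)  # long sentences look "complex"
        + 3 * vocabulary_variety                  # varied words look "sophisticated"
    )
    return round(score, 1)  # on a 0-to-10 scale

Perelman’s gibberish about teaching assistants with private jets is long, fluent, and varied, so a scorer built along these lines hands it a top mark while never noticing that every fact in it is false.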

I hope you have a great April Fool’s Day…try not to be taken in as I was by jokes that border on truth.

An Important Five Minutes

Let’s just start with the most interesting thing of all: Noam Chomsky has a Facebook page. And an active one at that, with lots of interesting and intellectually demanding content to explore.

Then, let’s move to this five-minute interview with Chomsky in which he speaks from an historical perspective when it comes to technology. Yes, we are experiencing amazing changes, he says, but they pale in the light of past changes. His first example is startling: the move from the sailing ship to the telegraph. Messages went from weeks and months to moments. A sharp intake of breath in the recognition that we may not be living in the most interesting of times.

His remarks on education make the all-important point that it isn’t the technology and it isn’t even the scholarship that matter most. The innovative thinkers are able to identify what is significant and use it as a frame for all the rest. Helping our students define a frame to use is an essential part of helping them access and learn from the Internet.

In poking around Chomsky’s FB page, I discovered that he has connections with Alice Walker, who writes about her image of Howard Zinn and Noam Chomsky playing tennis. It brings her hope for the world.

Reminiscing

I’ve been tinkering with the web since the late 1990s. In October 2001, as part of a grant project, I started a monthly newsletter that included resources for teachers. I’ve left them up as an archive but am not actively updating them. Every so often, I get an email from someone who has found a broken link and has suggested links for me to use. I got such an email this morning. The writer pointed to a link in the December 2002 newsletter. It turns out about half of the links are broken. I’m in the process of updating the whole website and may end up taking them down.

Except they provide an interesting snapshot of what was happening on the web. Google Images was a relatively new feature. And I was already benefiting from the work of Tim Stahmer, linking to his top 100 websites.

I clicked on that link and discovered that Tim has one of the most helpful and elegant 404 pages I have ever encountered. And also that he predates me on the web by just a couple of years.
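Finding the dead links didn’t require clicking through each one by hand; a short script can survey a whole newsletter at once. Here is a minimal sketch in Python, with placeholder URLs standing in for the real archive:

import urllib.request

# Placeholder addresses; the real list would come from the newsletter page.
urls = [
    "http://example.com/newsletter/dec2002/resource1",
    "http://example.com/newsletter/dec2002/resource2",
]

def is_alive(url, timeout=10):
    """Return True if the URL answers with a non-error HTTP status."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.status < 400
    except (OSError, ValueError):  # connection failures, timeouts, bad URLs
        return False

broken = [url for url in urls if not is_alive(url)]
print(f"{len(broken)} of {len(urls)} links are broken")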

I’m glad I was there near the beginning along with people like Tim. Having the long view helps put all the new, “earth-shaking” changes in perspective.

For the record, I am in the middle of a website overhaul. There are still some gems on my site, but they are hard to find, and everything just needs to be reorganized and brought into my WordPress installation.

It’s Not All Black & White When It Comes to Fertilizer

I was not expecting the first paragraph of Bill Gates’ plan to save the world to focus on fertilizer. Or really the whole first page. Turns out he is a little obsessed with it:

I am a little obsessed with fertilizer. I mean I’m fascinated with its role, not with using it. I go to meetings where it’s a serious topic of conversation. I read books about its benefits and the problems with overusing it. It’s the kind of topic I have to remind myself not to talk about too much at cocktail parties, since most people don’t find it as interesting as I do.

He finds it fascinating as an invention that has had a positive impact on human life, likening it to the polio vaccine.

Let me reiterate this: A full 40 percent of Earth’s population is alive today because, in 1909, a German chemist named Fritz Haber figured out how to make synthetic ammonia.

I bristled a little when I read this. As a part-time farmer who lives in the Chesapeake Bay watershed, I am biased towards hating synthetic fertilizer. Runoff causes algae blooms that shade the sun and create huge problems for aquatic life.

But Gates does mention the problems of overuse, and he seems like someone who would be interested in environmental concerns, so I felt like I needed to do some follow-up. What I found out was what I seem to continue to find out about most issues: there is no black and white answer when it comes to fertilizer. Certainly, in our country it is overused, probably because it is cheap and readily available and we have decided that green lawns are an object of desire. But in developing countries with poor soil, according to Hunger Math, fertilizer can increase crop yields, and that could mean the difference between life and death for those farmers who are raising the food to feed their families.

In other words, we shouldn’t deny artificial commercial fertilizer to the developing world merely out of a concern for the environment. Organic food production may result in healthier food and lower impact on the environment, but the needs of the hungry outweigh those values. First, feed the world.

They go on to suggest that the use of fertilizer might actually be “good” for the environment because by allowing each acre to produce more food, less land will need to be farmed:

The 150 million ha that would need to be fertilized, for one crop only per year, to end world hunger, is only about 10% of the total agricultural land. If we could obtain 2 fertilized crops per year from that land, we would only need to fertilize 5% of the agricultural land.
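If I unpack their arithmetic, the claim implies roughly 1.5 billion hectares of agricultural land in their accounting: 150 million is 10 percent of 1.5 billion, and harvesting two fertilized crops a year from the same fields would cut the fertilized area in half, to 75 million hectares, or about 5 percent.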

This all makes sense, and feeding people should certainly be a priority. Of course, using fertilizer is only one of many potential solutions to be explored for alleviating world hunger, but if it can save lives, that IS more important than environmental impact. But only to a point. Synthetic fertilizers do harm the environment. So, as with most of these kinds of sticky problems, we need to find the middle way: using fertilizers where they can make a real difference while also helping farmers learn sustainable techniques so that they can move toward more earth-friendly agriculture.

Beyond learning something about world hunger, my little foray into fertilizer was a reminder of how much information we have available to us. When we wonder about something, we don’t have to live with that wonder until we can get to a book or talk to some expert. Instead, we can fact check Bill Gates on the spot. Gates doesn’t provide any footnotes, so it’s up to us to figure out the truthiness of what he is writing. The exercise required close reading on my part, a focus of Common Core, and then the ability to frame my question, search for answers, and evaluate the sources providing those answers. Part of that evaluation was understanding that two of my sources, Hunger Math and Organic Valley, have their own biases that swing them to one side of the fertilizer question. The answer to “Is fertilizer good or bad?” is very similar to the answer to “Are charter schools good or bad?” or “Are interactive whiteboards good or bad?”: it depends.


Whose Idea Was It?

If I have seen further it is by standing on the shoulders of giants. (Sir Isaac Newton)*

Two articles in the New York Times this morning describe people who were able to take a good idea and make something out of it. They differ, however, in the way each person deals with the recognition for their work. Both articles are biographies of a sort: Nick Bilton describes the beginnings of Twitter while Margalit Fox presents the life of Ruth Benerito, who helped make wrinkle-free cotton. What these stories have in common is that often the person credited by history with the creation was not the original creator, just the one who took it farthest or managed to tell the best story.

Here’s how it played out for Dr. Benerito, who was inducted into the National Inventors Hall of Fame in 2008 for what is considered a significant development of our time:

Many news articles over the years have described Dr. Benerito as the sole inventor of wrinkle-free cotton, a distinction she repeatedly disavowed. In the shorthand mythologizing to which the media can fall prey, “permanent press” seems to have been a convenient hook on which to hang her many achievements in less readily understood areas of chemistry. Her demurrals, in polite Southern tones, were widely ascribed to modesty.

In reality, wrinkle-free cotton first appeared in the 19th century, developed by a Shaker community in Maine. In the 20th, many scientists contributed incrementally to the problem of persuading cotton, constitutionally crease-prone, to lie down and behave.

Benerito worked with colleagues to develop the chemical processes, and she never claimed full credit:

In a 2004 video interview produced by the U.S.D.A., Dr. Benerito reiterated that wrinkle-free cotton, like so much else in science, was the product of many hands over time.

“I don’t like it to be said that I invented wash-wear, because there were any number of people working on it, and there are various processes by which you give cotton those properties,” she said. “No one person discovered it or was responsible for it. But I contributed to new processes of doing it.”

The developers of Twitter are not quite so magnanimous. Bilton’s story is one of outsized egos attempting to develop the most compelling creation myth, with Jack Dorsey taking the most credit, suggesting he was thinking about Twitter when he was just eight years old:

In dozens of interviews, Dorsey completely erased Glass from any involvement in the genesis of the company. He changed his biography on Twitter to “inventor”; before long, he started to exclude Williams and Stone too. At an event, Dorsey complained to Barbara Walters that he had founded Twitter, a point she raised the next day on “The View” with Stone and Williams. Dorsey told The Los Angeles Times that “Twitter has been my life’s work in many senses.” He also failed to credit Glass for the company’s unusual name. “We wanted to capture that feeling: the physical sensation that you’re buzzing your friend’s pocket,” he told the paper.

Dorsey’s story evolved over the years. He would tell Vanity Fair that the idea for Twitter went back to 1984, when he was only 8 years old. A “60 Minutes” segment reported that Dorsey founded Twitter because he “was fascinated by trains and maps” and how cities function. Later, he would explain that he first presented the idea, fully realized, on a playground in South Park. All along, Dorsey began casting himself in the image of Steve Jobs, calling himself an “editor,” as Jobs referred to himself, and adopting a singular uniform: a white buttoned-up Dior shirt, bluejeans and a black blazer.

In many ways, it was Jobs who set the standard for knowing a good idea when he saw it and then having no problem taking credit for it:

In Walter Isaacson’s biography of Steve Jobs, Jonathan Ive, Apple’s head of design, recalls how Jobs occasionally hit upon his ideas. “He will go through a process of looking at my ideas and say: ‘That’s no good. That’s not very good. I like that one,’ ” Ive told Isaacson. “And later I will be sitting in the audience” — during a product presentation — “and he will be talking about it as if it was his idea.”

Bilton believes this is part of the Silicon Valley process: everyone knows the ideas are collaborative, but venture capitalists and journalists love a good genius story. These creation myths undermine the importance of collaboration in the development process. Researchers who publish in academia, however, highlight collaboration, often including a long list of authors on the paper, with the first author recognized as the primary contributor. It may not always have been a happy collaboration, but at least everyone gets some credit. As we work to encourage collaboration with our students, learning how to share credit is an essential lesson. We want them to be more like Dr. Benerito.

*In the spirit of this blog entry, I should point out that Newton was not the originator of this quote.