Saturday, April 6, 2013

Artificial Intelligence, Neuroscience, Education Technology, and the misreporting of science by The Times and others




Reading the newspapers about new technology is a lot like going to a fortune teller to find out about the future. Nice stories, but the reality is unknown. Here are the first three paragraphs from a recent New York Times article on computers that can give a grade to a college essay:


Imagine taking a college exam, and, instead of handing in a blue book and getting a grade from a professor a few weeks later, clicking the “send” button when you are done and receiving a grade back instantly, your essay scored by a software program.

And then, instead of being done with that exam, imagine that the system would immediately let you rewrite the test to try to improve your grade.

EdX, the nonprofit enterprise founded by Harvard and the Massachusetts Institute of Technology to offer courses on the Internet, has just introduced such a system and will make its automated software available free on the Web to any institution that wants to use it. The software uses artificial intelligence to grade student essays and short written answers, freeing professors for other tasks.
Sounds great, doesn’t it? Better service for students, less work for professors, and smart computers, all in one article. Except that it is all nonsense. The Times doesn’t say the software is AI, but almost every other paper printing the same story did. Here is the headline from the Denver Post for the same article:

New artificial-intelligence system grades essays at college level

We live in a time when every new piece of technology in education is touted as a great breakthrough. Now, AI is my field, and my specialty in AI is processing language. No computer can read an essay. Maybe someday, but not now. So, how do they grade essays if they can’t read them? By counting how many big words were used? By seeing if the sentences are grammatical? The articles that tout the glories of this stuff never actually say how. But no computer can tell if the writer had a good idea. That is actually very hard to determine. It is the reason why well-educated professors would take on the task of reading an essay: to see if there were any good ideas in it. But now that we have MOOCs and everything is about mass education, why bother? No one is listening to anyone’s ideas anyway. Just tens of thousands of students hearing the same lecture. Yet The Times and other papers keep touting MOOCs as a great breakthrough. They even had the audacity to mention that professors would now have time free to do other things. What other things? Their lectures are already recorded and they don’t grade papers, so what other things?
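To make the point concrete, here is a toy sketch of the kind of surface-feature "grading" I am describing. Everything in it is invented for illustration: the articles never say how the real software works, and this is certainly not EdX's actual method. The point is that a program like this can hand back a number without understanding a single idea in the essay.

```python
import re

def shallow_score(essay: str) -> float:
    """Toy 'grader' that scores an essay on surface features only.

    Invented for illustration; no understanding of ideas is involved.
    """
    words = re.findall(r"[A-Za-z']+", essay)
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    if not words or not sentences:
        return 0.0
    avg_word_len = sum(len(w) for w in words) / len(words)   # "big words" on average
    big_words = sum(1 for w in words if len(w) >= 7)         # count of long words
    avg_sent_len = len(words) / len(sentences)               # words per sentence
    # Arbitrary weights: this rewards length and vocabulary,
    # and knows nothing about whether the essay says anything.
    score = (0.1 * len(words) + 2.0 * avg_word_len
             + 1.5 * big_words + 0.2 * avg_sent_len)
    return min(100.0, score)

# A wordy, empty sentence outscores a short, clear one.
print(shallow_score("The catastrophic ramifications of international "
                    "belligerence are unquestionably deleterious."))
print(shallow_score("War is bad."))
```

Notice that the weights are pure guesswork; any such scheme produces a number that correlates with verbosity, not with quality of thought.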

The answer that any professor can tell you is research. All these MOOCs, essay-grading programs, and everything else we are hearing about are meant to allow professors to teach less and do more research. No problem with that; I had that view of the world too when I was a professor. The students get shortchanged by this and will get really shortchanged by MOOCs, but neither The Times nor the faculty of elite institutions care much about students.

The Times doesn’t care much about the truth either, at least when it reports on scientific breakthroughs. This is not unique to The Times, however. Here is another science story that hit the press the same day:

Scientists 'read dreams' using brain scans

This time it was a BBC headline, but many other papers reported the same scientific breakthrough. The scientists quoted in the reports said nothing of the kind, of course. What they said was that they can now detect images in the brains of some people whom they have studied. A sleeping person is awakened and asked what he was dreaming about, and when a similar brain pattern occurs another time, the scientists can detect it. Hardly “reading your dreams.”
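What the researchers did is closer to pattern matching than to mind reading. Here is a toy sketch, with entirely made-up numbers, of what "detecting a similar pattern" amounts to: a new measurement is matched against previously recorded ones that were labeled by asking the subject what he saw. Recognizing a repeat of a labeled pattern is not the same thing as reading a dream.

```python
import math

# Hypothetical recorded "brain patterns" (made-up numbers), each paired
# with the image the subject reported seeing when awakened.
recorded = {
    "car":    [0.9, 0.1, 0.3],
    "person": [0.2, 0.8, 0.5],
    "book":   [0.4, 0.4, 0.9],
}

def nearest_label(pattern):
    """Return the label of the closest previously recorded pattern.

    This is recognition of a repeat, not 'reading' a dream: without the
    subject's own earlier report, the numbers mean nothing.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(recorded, key=lambda label: dist(recorded[label], pattern))

# A new scan that resembles the earlier "car" pattern gets labeled "car".
print(nearest_label([0.85, 0.15, 0.25]))
```

The system can only echo back labels the subject already supplied; show it a pattern for something never reported, and it will still confidently pick the nearest old label.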

Newspapers like to make stuff up and people remember the nonsense they read in a headline. So the public thinks that computers can read and understand an essay and the public thinks that the computer can read your dreams. So what if this isn't even close to true? Another paper sold. 

One wonders if the scientists aren’t complicit in all this nonsense. The answer is yes and no. I am interviewed all the time and I know that the reporter will exaggerate what I said and write a ridiculous headline. I do the interviews anyway on the grounds that some good might come out of it. But many scientists want people to think they are doing stuff that they actually aren’t doing. This is particularly true of artificial intelligence, my own field, where the experts quoted in the Times article must have known full well how their work would be misinterpreted and didn't care.

Scientists are always selling so that people will get excited and give them more money to do research. And newspapers are always writing headlines that aren’t true but catch your eye.

The public loses by being misinformed. At the moment it is being misinformed about education in a serious way. Things in education are not improving. Technology is not helping (although it could). Things in education are getting much worse. Let’s see if The Times ever says that.

1 comment:

Mentifex said...

The mainstream media (MSM) have no idea what is really going on in artificial intelligence.