College is over. Not today, and not next week, but its time has come. The evidence is simple and easy to decipher. Consider this:
The U.K. offices of Ernst & Young have announced they will stop requiring degrees, and will instead offer online testing and seek out talented individuals regardless of background. Why? They say there is no correlation between success at university and success in careers.
Maggie Stilwell, EY’s managing partner for talent, said the changes would “open up opportunities for talented individuals regardless of their background and provide greater access to the profession”.
Why would EY do this, and why do I believe this is just the tip of the iceberg? Last month, the US Department of Defense awarded my company a large contract to train cyber attackers and defenders. The Pentagon official who awarded this contract told me that at the start of World War II, the US knew it needed to train 10,000 fighter pilots quickly. Do you think they were concerned about whether their future pilots had college degrees?
Today, he told me, “we need to train 10,000 cyber attackers”. Is he concerned that these attackers have college degrees? Quite the opposite. He has talked with many cyber attackers in the U.S. and has hired them to help us rather than be destructive. He has found that most of them were in Special Ed in school and many are on the autism spectrum. He cares about whether they like to break into things, and whether they can delve deeply and patiently into complex puzzles. They don’t need to know algebra. They don’t even need to know how to program.
Can you learn cyber security in a university? You certainly can’t major in it. You will find a few courses in it in some colleges, but you can be pretty sure that the professors will not have been hackers themselves.
The truth of this can be seen by examining almost any university program anywhere. I used to be the Chairman of the Computer Science Department at Yale. So, I was amused when, two years ago, the computer science undergraduates there had a mini-revolution, complaining that Google wasn’t hiring Yale CS graduates. The reason was clear enough. The faculty, many of whom were still there from my day, are, in essence, theoreticians. They may know how to program, but they don’t really do it anymore, and they want to teach about their new ideas and their latest theories.
I didn’t want to do anything different when I was a professor. My concern was Artificial Intelligence, not programming. So, I kept my best programmer with me whenever I moved, and he taught my students to program. He was not even a professor, but all Yale students from that time will tell you he was the best programming teacher they ever had. When I moved to Northwestern he came with me. I forced Northwestern to make him tenured faculty. But he has never been promoted in the 25 years he has been there. Why? He doesn’t do research; he just builds things and teaches students how to do that. Right now he is building an AI mentor that is a critical part of training 10,000 cyber attackers and defenders. Will he publish his work? He doesn’t care. Maybe we will write something together. But this is no way to succeed in the academic world and no way to train students. Students are cajoled into majoring in history or literature with no job prospects in sight. Even in Computer Science, the faculty has little interest in preparing students for the real world.
The situation is worse than you think. The Chairman of the Economics Department at Columbia told me that Calculus is a requirement for their majors. When I asked why, he responded that since there is no business major at Columbia, most of the kids who came to New York as a way of entering Wall Street soon discovered their only path was majoring in Economics. The Department didn’t want to deal with a large volume of students, so it created a Calculus requirement as a way of getting rid of the less intelligent ones.
Faculty in general are not interested in the pragmatic concerns of their students. I once proposed a job-related course of study when I was at Yale and was told by President Bart Giamatti: “we don’t do training, Roger.” (I responded that we did do training at Yale. We trained professors.)
Students buy into this idea at first. I once addressed the freshmen at Yale (the various Chairs were advertising the advantages of majoring in their respective departments). I said only one thing: “major in Computer Science, get a job.” I was booed by the freshman class. (All of the CS graduates that year (1981) went to work at Microsoft and are all quite wealthy today, I assume.) But Yale’s tradition is to train intellectuals (who typically came from wealthy families). The world has changed, but Yale not so much.
Last winter my cousin sent her grandson to me for advice on what college he should attend. I asked him what he wanted out of life, and out of college. He said he wanted to start his own business (again, a practical subject not typically taught in college, because the faculty have usually never started a business). And, he said, he wanted a college with lots of “rah rah.” We settled on one of the big state schools that is very good academically. I spent this Thanksgiving with his mother and asked how he was doing. She replied that he was having a tough time because he hadn’t gotten into the fraternity he wanted to get into.
He will not learn how to start a business in college, but he will have fun. (His school had a pretty good football season.) When the President of the U.S. says everyone needs to go to college, all he is really saying is that the high schools have failed and college is the only way you will learn to think at all. When he says “everyone needs to learn to code,” I start to wonder. Can Mr. Obama code? Why does he care?
But, others care, and although I think the experiment with coding “bootcamps” has been a failure, this is not because learning to code won’t get you a good job.
Here is what people are saying about these bootcamps:
Is it really possible to become a highly employable developer in just a few months? It certainly sounds that way if you look at the Facebook ads. A new coding bootcamp at Rutgers, for instance, says that you can “become a developer in just 24 weeks,” while Full Stack Academy, a for-profit startup, goes seven better (just 17 weeks) and has a 97% job placement rate to boot. But if you have no prior coding experience, and are looking for a well-paid computer-engineering job, you should be wary of such offers. The attractions of these courses are obvious. Software jobs routinely pay some of the highest wages in the country, and founders are often heard complaining about the “battle for talent.”
What’s more, the U.S. government has joined wealthy investors in supporting these shops. Now, it’s as easy to attend a coding bootcamp as it is to go to any other college.
But coding bootcamps are starting to garner skepticism, and for good reason. They’re still very new: the oldest have had graduates out in the field starting careers for a few years at most. There’s also an increasingly vocal group of people who say that they cut corners, and that they can’t possibly impart in just a few months the skills that a coder needs to be effective on the job.
I agree that the time spent in them is too short. I also don’t like their methods. (You can’t learn to program from listening to a lecture.) But more and more we will begin to see these competitors to colleges, and the colleges will not be able to adapt. Colleges are run by tenured faculty, after all, and they don’t want change. Being a professor at a top-tier school means hardly teaching at all. (I taught one quarter course every other year at Northwestern.) You work with PhD students and talk with colleagues and run around the world being famous. Undergraduates are not on your mind. (It is a really good job, and no one wants to really have to work at teaching.)
Where will we get our cyber security professionals? Here is a report I found:
CYBERSECURITY BUSINESS REPORT
The cybersecurity workforce shortage — which stands at 1 million job openings in 2016 and is projected to reach 1.5 million by 2019 — is especially acute at hospitals and healthcare providers, according to one industry expert.
"Healthcare IT projects are being consumed with runaway EHR (electronic healthcare records) projects" says Bob Chaput, a healthcare information risk management and compliance expert, explaining the main reason he sees for the lack of qualified cyber staff at hospitals. "Secondarily, healthcare leadership has been slow to prioritize and fund cyber programs" adds Chaput, who is CEO at Clearwater Compliance, a cybersecurity firm with a specialty practice geared to helping healthcare CIOs and CISOs.
We will have to open cyber security “bootcamps” as well. (And data analytics bootcamps.) Soon there will be many of these programs that are intended to produce professionals, and these professionals will get hired if they can show what they have produced in the way of work products.
Gradually, college will go back to what it always was: “a four-year vacation funded by my parents,” as one of my Northwestern students told me. But that means that only rich people will go to college. If and when college is made free, then you and I will be paying for their vacation. Kids who want jobs will go to bootcamps. (I sure hope they change that name.)
Or, we could try making high school more pragmatic and job oriented. (Of course that will never happen as long as Common Core is the rule and the Testing companies own education.)
Either way, our 3000 colleges will turn into 100 colleges soon enough. I am all for Harvard and MIT being places where cutting-edge research is done and where one can learn to be a researcher. But no matter how many times we say STEM, the fact is that we have more researchers than we need at the present time. Colleges, that is, those that are not in the top 100, will need to adapt, or they will die.
"Or, we could try making high school more pragmatic and job oriented. (Of course that will never happen as long as Common Core is the rule and the Testing companies own education.)" Your most salient point.
ReplyDelete