IT and the Ivory Tower

The perennial problem with teaching technology in an academic setting is that by the time cutting-edge applications reach the classroom, they’re often past their prime. IT professionals blame the distance between the ivory tower and the real world. The teachers “are mostly full-time professors and have no link to industry,” explains Pieter Dorsman, Amsterdam-based product marketing manager for T-Systems, a wholly owned subsidiary of Deutsche Telekom.

Educators respond that it’s not fair to tar all institutions with the same brush. “I think there are regional colleges and universities that keep pace better than others,” says Stuart Wasilowski, vice president of Workforce Development at South Piedmont Community College in Charlotte, N.C. “This is due in part to the funding formulas, business demand for [a] trained workforce and, most importantly, leadership personnel that allows this progress. When these stars line up, you have the possibility for an institution to be ‘up to speed.’”

Principle versus Practice

Much of the debate about the right way to prepare students for the real-world needs of enterprise IT centers on a fundamental question: Is it better to teach critical thinking or practical applications?

“Higher education’s purpose is to teach principles,” says San Francisco-based Greg Skinner, a former Alta Vista Principal Software Engineer and M.I.T. graduate. “It cannot keep up with technology, and it isn’t in its best interest to try. The best that it can and should do is to provide as strong a foundation as possible so that students can apply it to current technologies and possibly develop new technologies.”

But what about enterprises’ need to have fresh graduates hit the ground running from day one? “There will always be a delay in adopting newer technologies in universities, unless they create the technology themselves and it becomes a market standard immediately. The question is: Is this really important?” says Robert De Loght, IT management consultant/owner, RDL Consult in Brussels, Belgium. “I think that companies have the permanent responsibility to pay attention to training their personnel. Prior education just paves the way to absorb even more material.”

“Training students on one particular tool or programming language is meaningless because technology changes so rapidly,” agrees Wuchun “George” Shen, director, Business Intelligence Consulting at AnswerThink in Boston, Mass. “Higher education should instead teach students problem-solving and critical-thinking skills, scientific and innovative approaches and methodologies.”

Building Blocks or Bulldozers?

But should universities limit themselves to providing the building blocks? Or should they bulldoze their way to the frontlines?

“I think universities will be forced to hire or work in collaboration with third-party e-learning and instructor-led firms — or perhaps the companies that actually develop the products — that are more easily able to adapt and train [students in] new technologies,” says Steve Johnson, e-Learning Designer at Pacific Life Insurance in Los Angeles, Calif.

“If they don’t, I think the online degrees that people used to scoff at may actually become the best resource for the latest technology training, while [traditional] universities continue to lag further and further behind, graduating students with outdated skills by the time they receive their diploma,” he adds.

Universities are well aware of the challenge and are doing their best to respond. “We’re not up to speed universally across our whole campus yet, but we’re getting there,” says Ray Miller, Adjunct Assistant Professor, College of Applied Science — MET, University of Cincinnati. “With 36,000 students, it is a challenge.”

Academia Doesn’t Pay

It may be unrealistic to expect universities to ever match their enterprise peers. “Universities are large bureaucratic organizations that have very limited resources. With technology changing so rapidly, they just don’t have the budget to buy the latest hardware or software technology as it emerges — particularly because it’s always more expensive when it first comes out,” says Johnson. “Since they don’t have the budget to purchase the latest technology, they don’t have the ability to develop a curriculum around it, and they can’t teach it.”

Nor are commercial companies likely to lend a hand in sharing cutting-edge technology. “Research is no longer funded when it does not have some kind of predictable short- or medium-term return. As a result, most discoveries are made in commercial companies that share their developments no sooner than when they are certain they can make money with [them],” explains Dorsman.

Dorsman notes that proprietary interests prevent him from releasing information on his own research developments. “If I were doing the same [research] at a university, I would have published at least five white papers over the past five years,” he says. “Sadly, no university would fund this.”

The fundamental fact is that while the mind is not limited by resources, academic institutions are. Expecting colleges and universities to replace employee training entirely is an outdated notion; no institution can impart all the latest knowledge in a specific domain.

Instead, says De Loght, “higher education should prepare people for flexible and permanent, life-long learning. Even if you spend five years or so at university, you still have 40 years to go in your professional life. It would be better to focus on the 40 years.”