Pace of Grad School

Currently, I’m out of town attending a workshop with some very specialized people in an area where I’m trying to make inroads. And while I won’t bore you with the details (I never knew people could have such levels of nuance), I will share with you a comment that I heard over drinks.

Let me paint the picture for you a little bit first. I was having a discussion with someone very high up in the food chain about current/potential/future projects. This person asked a very specific technical question related to the problem, something that would mean significant money for the company that person represents if it panned out. I, wanting to hold on to my ideas, remained steadfast with a response along the lines of “Talk to me in 18 months when we have it working in the lab.” This is basically a nice way of saying “like hell I’m going to let you steal my idea.”

In previous discussions of this type, the response is usually “well, I look forward to seeing it in person when it’s running.” Some people actually mean that, while others are really saying “fat chance that will ever work.” I expected one of those responses but got neither in this case. Rather, the response was a tirade about how academia is too slow, how this technology isn’t what academia should be doing [and should give it to that company], and how academia is too expensive to get things done. (For those of you only barely following along, that translates to: we want it sooner, this technology could be a competitor if successful, and no, I won’t fund your research in this area.)

I’m only going to tackle the first of those comments/tirades, but I think it raises a very important discussion question: Is academia too slow, and if so, what are the reasons, and how can the pace be increased?

In general, I would say that academia is slow, but I would not call it too slow. For instance, the turnaround for hearing feedback on proposals (if successful) is in the 6-9 month range, and then it’s another 3 months before the money is available. Then, if you do not already have a student, you have to find one, which is time consuming. Plus, that student may be in the process of taking classes and their qualifier, all of which delay the process. Thus, the time from when a prof is writing the proposal to when a student is working full time in the lab could be 2-3 years. That is slow. Obviously, if you already have a student on a similar project, or one who is done with classes, that timeline is shorter.

But is that really too slow? In engineering, especially from a corporate perspective, it is. If the prototype is not 3-6 months away, then it is not something on their radar. But because companies are often shortsighted, they fail to recognize quality technology in its infancy, when they could offer a partnership or buy the technology outright, and instead end up scrambling to compete with a new spinoff/start-up. They are then reduced to paying a premium for the technology if they choose to go that route.

With that said, the pace of funding proposals could (ideally) be increased, but this is not likely to happen. Also, I do think some classes are a waste for students. Often, because graduate course offerings are limited, students have no choice but to take certain courses that have no bearing on their research or any topic of interest. When that is the case, research (and learning through research) is clearly a better option. Reducing the number of classes would also let students get into research sooner, or do research concurrently with classes. (FYI, most European PhD programs have no mandatory classes. It seems to work fine for them.)

What about you? If you’re a grad student, do you feel the pace is too slow/fast? For those of you with both corporate and grad school experience, is that a fair assessment?


I think the thing with academia and most research projects is this: it doesn’t matter how fast it is. If it’s as leading-edge as I assume most research projects are, then an extra year or two should not matter. If the person who berated you for your speed needs it faster, it likely should not exist in academia; not because it’s slow…but because it likely is not leading-edge enough.

Knowing the field and the idea, it’s leading-edge enough that a solution to the problem would have been implemented in industry ASAP at any point in the last 10-15 years had it been available. The fact that no one has been able to do it means that it is leading edge (by definition). So I will have to completely disagree with your implication that slow and ponderous means “leading edge” while fast and rapid means “not leading edge.”

I don’t think that’s what Chris meant. In most cases, implementing cutting-edge technology is going to take way more than 6 months, but those are the product cycles most often used in industry…and new products are not necessarily cutting edge.

I agree with this assessment. I’ve worked in a company doing what was pretty close to the cutting edge for industry, and I’ve also spent a few years in grad school now.

In general, I feel that at least in my field, which is computer architecture, industry has much better know-how about how to solve the problems that we are facing today or are going to face in the next, say, five years. As academics, our job is to look further out: probably ten, maybe fifteen years.

I’ve also seen a few spectacular failures when academics tried to solve problems that were only a few years out. Two examples I can think of are:

(1) The gate leakage problem, where the models predicted that leakage through the gate of CMOS transistors was going to become comparable to subthreshold leakage. This never happened, because industry simply started using high-k dielectrics.

(2) The soft error problem in microprocessors. The early 2000s saw a lot of research going into designing architectures for “unreliable CMOS.” The hypothesis was that by the time we got to 45nm, CMOS would be extremely vulnerable to transient faults, so we wouldn’t be able to rely on a reliable circuit substrate for computation. The solution, some argued, was to build reliable architectures based on redundant execution. For a number of reasons, perhaps the most important being a misunderstanding of the real soft error rates in microprocessors and of how cheaply technology and circuit solutions could mitigate them, most of the work on these fancy reliable architectures went down the drain. I should mention that I added a few dead trees of my own to this area of research before I became an industry engineer and learned that we were all solving the wrong problem.

I think that GEARS is optimistic about the funding cycle: things were like that 20 years ago, but now few proposals are funded on the first submission. It often takes 2–3 years before any funds come after a proposal is written. This may not be a problem in pure science, but in engineering, it is often too slow.

I think that the funding model for research in this country is fundamentally broken, with huge amounts of money and time wasted on chasing funds rather than doing research. Industrial research has almost disappeared (replaced by much shorter time-frame development) and government funding has become more bureaucratic and more “mission-focused” (meaning short-term application rather than long-term creation of new ideas).

I agree about industry being shortsighted in general, but disagree about the timelines presented. Sometimes I work on prototypes that we make next week, next month, or next year. Sometimes we anticipate needing a new component design that is years away and slowly make progress on it alongside everything else. I think the difference is that cutting edge in research may not make it to industry for 30 years, or maybe 10 years but in some other application. Cutting edge is often very expensive, and many of the basics in automotive and aviation applications have not changed much in 50 years. Yes, some things have, but “better” for commercial industry is different from “better” for research. Eking out a 0.01% higher-temperature material is cool and all, but industry tends to adopt the changes that are easiest for the most output. Changing the flow of my cooling-system duct might be less cutting edge than the fancy new material, but it is easy and it works. I don’t mean to dismiss research, but it’s not just a one-way street of great ideas always flowing out into the commercial world.

Pure business people (I’m talking about MBAs and accountants) focus almost exclusively on cost and time. They are under the illusion that they can do “everything” better because they have a clear goal in mind and don’t put up with any namby-pamby navel-gazing.

One of my bosses in a manufacturing operation was an accountant and he thought that solving problems on the floor was just as straightforward as putting new figures in his spreadsheet. He didn’t understand trial and error as an important engineering activity. It didn’t help that we despised each other.

Aside from an MEng degree, I’ve spent most of my time in industry. I can say that no project I’ve worked on, nor any project I’ve heard of, has ever been on time or on budget.

In one case, I sat by while a VP in charge of new products worked through the ramifications of cancelling a project after $10 million (that’s Canadian, so it wouldn’t be as high in Greenbacks) and 3 years’ labour had been expended on a spiffy Java application for surfing the net on your phone. This was in the prehistoric ’90s. Prior to this bombshell, the VP had repeatedly asked such thought-provokers as, “What is this thing really going to be used for?”

That’s the question you ask at the start of the project, not at the tail end. In this instance, the academic approach of clearly understanding a problem would have been a breath of fresh air.

Oh, and by the way: I don’t think the project was ever officially cancelled. However, equipment and staff were gradually assigned to other projects. So much for facing up to one’s mistakes ;-)
