11 responses to “Theoretical vs. Experimental”

  1. gasstationwithoutpumps

    I think that your view here is very colored by being a mechanical engineer. I used to teach VLSI design, where fab was expensive, took several months, and there was no easy way to probe the design if it failed. Simulation is absolutely essential in that field.

    Nowadays, the things I design exist entirely as software. We don’t simulate the software—we build it and test it, but it lives entirely within the computer. A balance of theory and practical programming skills is necessary to get good results. Theory alone is useless, as is programming without deep understanding of the algorithms to be implemented.

  2. db

    Pretty much the same situation as the previous commenter. I am an ME, but I was doing some work in microfabrication a few months ago. Doing the actual fabrication is very expensive and time consuming. Alternatively, I could model the system in a couple of days (compared to a couple of months to make a prototype).

  3. brainfreeze

    Optimization is another reason for models, even in the ME world. There is a limited degree to which you can tune a design once it is constructed. Sure, you can muck around with anything that is controlled by software, but, to go with the engine example, you can’t exactly vary the bore of your cylinders in order to squeeze out a few extra HP once you’ve got the engine block built.

    Another way to think about this whole thing is that models are like guns: they are neither good nor bad unto themselves but can do lots of damage in the hands of someone who doesn’t know what they’re doing.

  4. Cherish The Scientist

    I have to agree. I’m definitely on the theoretical side, and I believe it’s important to have a healthy skepticism of modeling. That said, this sounds a bit more like all-out rejection.

    Modeling doesn’t take into account some of the unknowns, but there are a lot of ‘knowns’ that can be tested and optimized before a prototype is ever made. For instance, the widget I just made: I spent a lot of time modeling, and I found out that certain materials worked a lot better than others. I wasn’t able to purchase all the materials to try them out, but the ones we were able to get validated the things we saw in simulation. So rather than having the undergrad try all this stuff that didn’t work and giving up, we were able to try it out in simulation and find that only one or two things worked. Therefore, once we got a working widget, we were able to spend more time optimizing and experimenting, rather than wasting all our time trying to build things that would not work…not to mention reducing our materials expense.

    Also, modeling *is* fun. Building models, for me, is a lot like building physical objects for people. If I have an idea, I can use modeling to see if it’s valid or not. I admit that I feel even better when the physical object has been built, and my models have been validated. I, however, really like planning and thinking about things and am not very good in the lab (I have a Pauli effect – http://mareserinitatis.livejournal.com/558898.html). Modeling is the best way for me to do it.

  5. Jacob

    Another important role of models is to understand the results from a puzzling experiment. Sure it might be much faster to run the experiment than to develop the model – but if the behaviour is not what you expect, how do you explain it? If the experiment shows some unwanted vibration or inexplicable runaway heat generation, how can you correct it?

    A (good) model can help to find the cause of such problems, and test out various virtual solutions inexpensively. It goes hand in hand with experiment because if your model predicts the same results as you see in the experiment (even when you change something) then you also know the model is a good one, and can have some confidence in it.

    Another advantage of a model is that it can tell you things that are very hard to probe in an experiment. For example, it’s easy to measure the temperature on the surface of objects, but more difficult to measure the temperature on the interior of something, especially if circumstances don’t allow you to bore a hole to insert a sensor probe. (Perhaps the material is too brittle). So a thermodynamic model that matches observed temperatures on the surface could tell you whether temperatures are rising to a dangerous level in the bulk. And then just for fun it could tell you about the stress distribution caused by thermal gradients. =)
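
    As an aside, here is a minimal sketch of that idea, assuming a simple 1D slab with invented material properties and a hypothetical measured surface-temperature history; a real thermal model would of course use the actual geometry, properties, and boundary conditions.

```python
# Minimal 1D transient heat-conduction sketch (explicit finite differences).
# All numbers (slab thickness, material properties, surface-temperature
# history) are invented purely for illustration.
import numpy as np

L = 0.05            # slab thickness [m] (hypothetical)
nx = 51             # grid points through the thickness
alpha = 1.0e-6      # thermal diffusivity [m^2/s] (hypothetical material)
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha   # time step chosen for explicit-scheme stability

T = np.full(nx, 20.0)      # initial temperature [deg C], uniform

def surface_temp(t):
    """Hypothetical measured surface temperature rising toward 80 C."""
    return 20.0 + 60.0 * (1.0 - np.exp(-t / 300.0))

t = 0.0
for _ in range(20000):
    t += dt
    T_new = T.copy()
    # interior nodes: dT/dt = alpha * d2T/dx2
    T_new[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T_new[0] = surface_temp(t)   # measured surface drives one face
    T_new[-1] = T_new[-2]        # insulated back face (zero flux)
    T = T_new

print(f"surface: {T[0]:.1f} C, mid-plane: {T[nx // 2]:.1f} C, back face: {T[-1]:.1f} C")
```

    Once the predicted surface temperature tracks the measurements, the same model reports the interior profile that cannot be probed directly.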

    Anyway, I think there shouldn't be a war between theory and experiment. Both sides really need each other!

  6. SimJockey

    Other commenters have hinted at this, but it's worth calling out explicitly. Simulations can help you model things which are impossible to build in the real world, yet those models can give extremely useful insights into system behaviour.

    For instance, I could build a model of a microprocessor with perfect caches and branch prediction, and this would give me a very good idea of the bottlenecks inside the core of a modern out-of-order superscalar. You could also build a model with an infinite number of functional units; again, this gives you insight into the other bottlenecks limiting instruction-level parallelism.

    I realize these examples may seem like gibberish to the untrained, but my point is that models can do things that are _impossible_ to do with real experiments. And sometimes, doing the impossible is extremely useful.
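
    To give a flavour of the second example, here is a minimal sketch of such a limit study, assuming an idealized machine (unlimited functional units, perfect caches and branch prediction, one-cycle latency) and a made-up six-instruction trace; real tools do the same bookkeeping over traces of billions of instructions.

```python
# Minimal instruction-level-parallelism "limit study" sketch.
# Assumes an idealized machine: unlimited functional units, perfect caches
# and branch prediction, single-cycle latency -- only true data dependences
# constrain the schedule. The trace below is a made-up toy example.
from collections import defaultdict

# (destination register, [source registers]) for each dynamic instruction
trace = [
    ("r1", []),            # load immediate
    ("r2", []),            # load immediate
    ("r3", ["r1", "r2"]),  # r3 = r1 + r2
    ("r4", ["r1"]),        # r4 = r1 * 2   (independent of r3)
    ("r5", ["r3", "r4"]),  # r5 = r3 - r4
    ("r6", ["r2"]),        # r6 = r2 + 1   (independent chain)
]

ready = defaultdict(int)   # cycle at which each register's value is available
finish_time = 0
for dest, srcs in trace:
    # an instruction can issue once all of its inputs are ready
    issue = max((ready[s] for s in srcs), default=0)
    ready[dest] = issue + 1          # idealized one-cycle latency
    finish_time = max(finish_time, ready[dest])

ilp = len(trace) / finish_time
print(f"{len(trace)} instructions in {finish_time} cycles -> ideal IPC ~ {ilp:.2f}")
```

    Relaxing one idealization at a time (finite functional units, realistic cache latencies, imperfect prediction) then shows which constraint bites first.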

  7. is simjockey a monkey? « simulation jockey

    […] Miss MSE has entered the model vs. experiment debate and she’s firmly in the experimental camp. […]

  8. riven

    Theory versus experimental
    I am writing this from the perspective of a chemical engineer working in research. My goal is to take a particular type of 'filter' and scale it up from lab scale (1 L, a few weeks) to pilot or demonstration scale (10,000 m3, 9-18 months). Most of the work occurs at the lower end of the scale, and only successful filters proceed to the larger scale. The filter is a process for removing specific impurities from a solvent or oil system (usually from 10% down to 0.5%). The filters are made by materials experts (who simulate only the materials side, not the process), and we have a dedicated simulation expert for assessing the potential energy and economic savings of the process in industry. So we have three disciplines: hands-on engineers, materials scientists, and a simulation engineer. In the past I have made extensive models (with and without simulation software); I do not currently do this, though I can critique any model that passes my way. Below is simulation versus hands-on.
    The first question to answer is: why do modeling at all? There are obviously a number of reasons, but the two most important are time savings and cost. Models allow an engineer to cut out future work on this or other projects, and give the ability to take legitimate, well-founded shortcuts. This can lead to savings that may not be apparent with the experimental route. A particular filter can be tested for a short time; if its performance does not meet the needs set out by a model (i.e., the minimum performance it needs to achieve), the filter can be discarded. The equipment does not need to be built, and one can move on to other materials or projects.
    Once I have tested at lab scale and found that the simulation is promising, or vice versa, I or somebody else (say, if company X buys the rights) have to prove this by building the larger-scale process. Both filter area and, consequently, volume are scaled up, to see whether making the filter is reproducible and whether its performance is reproducible across that area. This data can then be fed back into the simulation, or into a more complex one. In any case, some amount of data is needed, or some amount of simulation (if an idea of filter performance is known from the literature), to start the process. Both are needed to progress through the later stages.
    The problem with scale-up is staying grounded in reality. For a hands-on engineer that usually is not a problem: equipment issues arise at all scales and have to be dealt with, and knowing the limits of the material comes from experience using said material. For the person doing the simulation, however, there is little such knowledge. Running simulations often gives an aura of credibility to a process that may not be grounded in the realities of the underlying material's capabilities. Process modeling and simulation tools cannot actually build, deploy, and run the real process; they tend to gloss over technical problems and the limitations of equipment. Thus much misplaced confidence can be put in the results simply because 'the simulation says so.'
    Both are needed, but an experienced engineer is required to interpret the results. Without one, models can lead you down the wrong path.

  9. GMP

    Wow. This was a surprisingly dismissive post.

    Many commenters above have said all there is to be said, but let me reiterate: (a) Often trial and error is too expensive and/or time consuming. The microelectronics industry has funneled tons of money into theory and simulation precisely so it wouldn't have to burn orders of magnitude more on just trying something and seeing if it works. (b) I do theory/computational modeling, and it seems to me you haven't really worked with many good theory people. There is a lot of BS theory, but done right it can be extremely useful and predict exactly what you will get in well-controlled experiments. We have done it numerous times. We can also tell you where to look and where not to look; the best theorists are well aware of the experimental limitations and constantly talk to experimentalists. I have been in each of my collaborators' labs numerous times, and we discuss experimental details at great length. Doing theory does not mean being oblivious to reality. (c) Also, there is nothing better than theory to explain puzzling data: in theoretical models you can turn certain physical aspects on and off and see what dominates. That's how we develop a deeper understanding of what goes on inside physical systems.
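
    As a toy illustration of point (c), here is a minimal sketch, with entirely invented parameters, of switching physical effects on and off in a model: a driven oscillator in which the damping term and the driving force can each be turned off to see which effect dominates the response.

```python
# Toy example of switching physical effects on and off in a model:
# a driven, damped oscillator x'' + gamma*x' + omega0^2 * x = f0*cos(omega_d*t).
# All parameter values are invented purely for illustration.
import math

def peak_amplitude(damping_on=True, driving_on=True,
                   omega0=2.0, gamma=0.3, f0=1.0, omega_d=2.0,
                   dt=1e-3, t_end=50.0):
    """Integrate with semi-implicit Euler and return the peak |x| seen."""
    g = gamma if damping_on else 0.0
    f = f0 if driving_on else 0.0
    x, v, t, peak = 1.0, 0.0, 0.0, 0.0   # start displaced so undriven cases evolve
    for _ in range(int(t_end / dt)):
        a = -omega0 ** 2 * x - g * v + f * math.cos(omega_d * t)
        v += a * dt          # update velocity first (semi-implicit Euler)
        x += v * dt
        peak = max(peak, abs(x))
        t += dt
    return peak

# Turn each effect on and off and compare the responses.
for damping_on in (True, False):
    for driving_on in (True, False):
        p = peak_amplitude(damping_on, driving_on)
        print(f"damping={damping_on!s:<5} driving={driving_on!s:<5} peak |x| = {p:.2f}")
```

    Comparing the four runs shows at a glance whether damping or driving dominates the peak response, which is exactly the kind of what-dominates question described above.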

    Nowadays, with the use of computers, we are no longer constrained to oversimplified models of spherical cows and whatnot; those are things of the past. Theory with good predictive capabilities is now done on the computer, and models can be very realistic and very sophisticated. The absolute best papers are those that combine theory and experiment: verification that something is happening, accompanied by a detailed explanation of the underlying mechanisms.

    I am sorry you feel that theorists are useless in your field; perhaps you just had the misfortune of talking to those who are too old school or simply disinterested in the real world. I am very happy that my collaborators don’t share your sentiments (my field of research falls under applied physics).

    To wrap it up, here’s a link to a related post:
    Experiment or Theory?

  10. Krishna Chaitanya

    @Miss Outlier

    First off, your blog is an absolute delight to read. Though admittedly out of context, the reason I stumbled onto this post is that I came across your post on 'A Place to tinker'. I am envious of your workshop, to say the least.

    @ the above commenters

    The post started with a statutory notice about good ribbing. I think the responses were being kind of harsh; it is all about her point of view on the topic. Having said that, Miss Outlier has invited this ire by calling theorists "unnecessary".

    I am an electrical engineer by profession, and I firmly believe that both experimental and theoretical approaches are necessary. The ratio of the two varies depending on other factors, such as complexity, environment, discipline, or even time.

    As I see it, the experimental and theoretical approaches are complementary (maybe supplementary sometimes). Simulations ARE based on theory (the other kind: maths and equations). They can only model a system to a certain degree, based on known "knowns" and known "unknowns". It is only by experimentally verifying the system that the unknown "knowns" and unknown "unknowns" can be found.

    Here is the beauty: these knowns and unknowns can then be modelled into the system, creating a more accurate behavioral representation. But there is nothing like experimental verification to see if it works, or even to reduce the knowns and unknowns.

    And our present simulators ( in most disciplines) are based on years, if not decades, of experimental data.

    On a lighter note, you can write a piece of software to very accurately model the baking of a cake, taking into account the temperature, the oven, and the ratio of compounds, and experiment with the simulator on the PC, but there is nothing like physically having the cake so you can taste it.

    Thank you

  11. Rachel

    This is so spot on. There is a huge difference between people who are excessively theoretical and application people. People who focus on functional application in a real-world context just don't have time to muse (read: waste time) about theoretical operations. Thank you for writing this, if only for the validation it provides applied thinkers.

    I wonder what the theorists say about us? Wait, no I don’t want to know.