Page 1 of 1 [13 Posts]
The_Man

Joined: 21 May 2006
Posts: 52

Posted: Fri Jul 14, 2006 6:52 pm    Post subject: Re: Theoretical Estimation of Chemical Reaction Rate Constants

Squark wrote:
 Quote: The_Man wrote: uekstrom@gmail.com wrote: Squark wrote: uekstrom@gmail.com wrote: It depends on your definition of large, but hundred atoms is no problem for an energy calculation, even on a single cpu. Probably some density The single point energy calculations are no sweat. It is the geometry optimizations (particularly for TS) which take a lot of time. What is "TS"? Do I understand correctly you are referring to finding

"TS" stands for transition state.

 Quote: the locations of the nuclei which yield the minimal energy? What are the methods used for such optimizations? Conjugate gradient?

Conjugate gradient methods are usually the most effective, although
in some cases steepest descent can be used.
What makes transition states so challenging (among other things) is
that you not only have to find A transition state, you have to find THE
transition state (the exact one that connects the two minima of
interest).

 Quote: Newton-Raphson? These methods require computing _derivatives_ of the potential, but as far as I understand, standard perturbation theory can do the job.

There are several different methods - Polak-Ribière, Fletcher-Reeves,
the Berny algorithm. Most methods for computing energies (semi-empirical,
HF, DFT) can compute derivatives analytically, which is faster than
doing it numerically.
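For concreteness, here is a minimal sketch of what a Polak-Ribière
conjugate gradient optimization looks like, minimizing a toy
two-dimensional "potential surface" rather than a real quantum-chemical
gradient (the energy and gradient functions below are illustrative
stand-ins, not anything a chemistry code would use):

```python
import numpy as np

def energy(x):
    # Toy 2D "potential energy surface": an anisotropic quadratic bowl
    return x[0]**2 + 10.0 * x[1]**2

def gradient(x):
    return np.array([2.0 * x[0], 20.0 * x[1]])

def polak_ribiere_cg(x, tol=1e-8, max_iter=200):
    """Nonlinear conjugate gradient with the Polak-Ribiere beta and a
    crude backtracking line search (a real code would use a better one)."""
    g = gradient(x)
    d = -g                          # first direction: steepest descent
    for _ in range(max_iter):
        step = 1.0
        while step > 1e-12 and energy(x + step * d) >= energy(x):
            step *= 0.5             # backtrack until the energy decreases
        x = x + step * d
        g_new = gradient(x)
        if np.linalg.norm(g_new) < tol:
            break
        # Polak-Ribiere formula; max(0, .) restarts with steepest descent
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        g = g_new
    return x

x_min = polak_ribiere_cg(np.array([3.0, 2.0]))
print(x_min)   # converges toward the minimum at the origin
```

This only finds minima, of course; transition-state searches need
extra machinery (following an uphill eigenvector of the Hessian) on
top of a scheme like this.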

Perturbation theory generally works poorly. Moller-Plesset perturbation
theory to second order (MP2) is slower than DFT, and usually much less
accurate. MP4 is more accurate than DFT, but much more expensive in
terms of computational resources. Perturbation theory is
NON-VARIATIONAL, which means that the energy calculated can be lower
than the true energy. In fact, MP2 usually overcorrects for electron
correlation.

 Quote: And don't even get me started about frequency calculations. ... Frequency calculations on a molecule with 100 atoms might or might not be possible, given one's computational resources. Frequency calculations means second derivatives. So I guess Newton-Raphson would be too expensive per iteration?

Yes, second derivatives (Hessian).
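To make that concrete, here is a toy sketch of what a frequency-style
calculation does numerically: build the Hessian by central finite
differences and count its negative eigenvalues (each negative
eigenvalue corresponds to one imaginary frequency). The energy function
below is a made-up saddle, not a real electronic-structure surface:

```python
import numpy as np

def energy(x):
    # Toy surface: a minimum along x, a maximum along y -> first-order saddle
    return x[0]**2 - x[1]**2

def hessian(f, x, h=1e-5):
    """Central finite-difference Hessian of a scalar function f at x."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = h
            e_j = np.zeros(n); e_j[j] = h
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                       - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * h * h)
    return H

H = hessian(energy, np.array([0.0, 0.0]))
n_negative = int(np.sum(np.linalg.eigvalsh(H) < 0))
# 0 negative eigenvalues -> minimum; exactly 1 -> transition state
print(n_negative)   # 1: a first-order saddle point
```

In practice one diagonalizes the mass-weighted Hessian to get the
frequencies themselves, and the analytic second derivatives mentioned
above replace this finite-difference step.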

 Quote: Transition states by definition involve the making and breaking of bonds. Unless you're considering a transition between different conformations or configurations, but yes, I was referring to the case when bonds are made/broken. One more thing to keep in mind - reaction rates are not always that well known anyway, so keep that in mind when looking at CTST and its brethren like RRKM theory. "Well known" in what sense? Experimentally?

Yes.

 Quote: Best regards, Squark
Squark
science forum beginner

Joined: 14 May 2005
Posts: 33

Posted: Mon Jul 10, 2006 3:41 pm    Post subject: Re: Theoretical Estimation of Chemical Reaction Rate Constants

The_Man wrote:
 Quote: uekstrom@gmail.com wrote: Squark wrote: uekstrom@gmail.com wrote: It depends on your definition of large, but hundred atoms is no problem for an energy calculation, even on a single cpu. Probably some density The single point energy calculations are no sweat. It is the geometry optimizations (particularly for TS) which take a lot of time.

What is "TS"? Do I understand correctly you are referring to finding
the locations of the nuclei which yield the minimal energy? What are
the methods used for such optimizations? Conjugate gradient?
Newton-Raphson? These methods require computing _derivatives_ of
the potential, but as far as I understand, standard perturbation theory
can do the job.

 Quote: And don't even get me started about frequency calculations. ... Frequency calculations on a molecule with 100 atoms might or might not be possible, given one's computational resources.

Frequency calculations means second derivatives. So I guess
Newton-Raphson would be too expensive per iteration?

 Quote: Transition states by definition involve the making and breaking of bonds.

Unless you're considering a transition between different
conformations or configurations, but yes, I was referring to the
case when bonds are made/broken.
 Quote: One more thing to keep in mind - reaction rates are not always that well known anyway, so keep that in mind when looking at CTST and its brethren like RRKM theory.

"Well known" in what sense? Experimentally?

Best regards,
Squark
The_Man

Joined: 21 May 2006
Posts: 52

Posted: Sun Jul 09, 2006 4:55 am    Post subject: Re: Theoretical Estimation of Chemical Reaction Rate Constants

uekstrom@gmail.com wrote:
 Quote: Squark wrote: uekstrom@gmail.com wrote: It depends on your definition of large, but hundred atoms is no problem for an energy calculation, even on a single cpu. Probably some density

The single point energy calculations are no sweat. It is the geometry
optimizations (particularly for TS) which take a lot of time. And don't
even get me started about frequency calculations. You need to do a
frequency on each optimized structure (transition state or
intermediate) to do two things: determine that it really is a minimum
(zero imaginary frequencies) or a transition state (exactly 1 imaginary
frequency), and second, to find the zero-point energy. As you are
aware, molecules continue to vibrate even at 0 K, and this zero point
energy (recall that the vibration energy E = (n + 1/2) h nu, so the
energy is non-zero even in the lowest vibration state n=0) is
important. It accounts for some of the kinetic isotope effect.
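To put a number on that, here is a back-of-the-envelope sketch of the
zero-point energy difference between a C-H and a C-D stretch, using the
E = (n + 1/2) h nu formula quoted above. The 3000 cm^-1 frequency and
the 1/sqrt(2) isotope scaling are rough textbook values in the harmonic
approximation, purely for illustration:

```python
import math

h = 6.62607e-34      # Planck constant, J*s
c = 2.99792458e10    # speed of light, cm/s
N_A = 6.02214e23     # Avogadro's number
kcal = 4184.0        # J per kcal

def zpe_kcal_per_mol(wavenumber_cm):
    """Zero-point energy E = (1/2) h nu for a single vibrational mode."""
    nu = c * wavenumber_cm           # frequency in Hz
    return 0.5 * h * nu * N_A / kcal

# A typical C-H stretch sits near 3000 cm^-1; deuteration lowers the
# frequency by roughly 1/sqrt(2) (reduced-mass effect, harmonic model).
zpe_CH = zpe_kcal_per_mol(3000.0)
zpe_CD = zpe_kcal_per_mol(3000.0 / math.sqrt(2))
print(zpe_CH - zpe_CD)   # ~1.3 kcal/mol: part of the kinetic isotope effect
```

A zero-point difference of that size in the reactant but not in the
transition state is exactly how the primary kinetic isotope effect
shows up in the barrier height.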

Frequency calculations on a molecule with 100 atoms might or might
not be possible, given one's computational resources.

 Quote: functional method will give you best results unless you are breaking bonds, which requires a multi-reference approach.

Transition states by definition involve the making and breaking of
bonds.

 Quote: Can you elaborate what is a "multi-reference" approach? Sure. The exact many particle electronic wavefunction can be written as an (infinite) sum of slater determinants ("configurations") constructed from an (infinite) set of one particle wavefunctions (orbitals). In reality you always truncate both the orbital space and the sum over configurations somehow. With a given one particle basis set you might be able to do an exact solution, a full configuration interaction (Full CI), for a small number of electrons. The most crude approximation to this approach is then the Hartree--Fock approximation, where you optimize the orbitals to give the best (lowest energy) single determinant approximation to the full wavefunction. This usually works well in a system with a large bandgap, because in the exact expansion this determinant would have a large weight (0.95, say). On top of this you can then add electron correlation though perturbation theory or other approaches.

HF gives very poor results for transition states; in fact, even for the
ground states for which it was designed, it often gives worse results
than semi-empirical methods (which run hundreds of times faster than
HF). Don't use semi-empirical methods for transition states, though;
they are parameterized for ground states, and give ridiculous values
for transition states.

HF is suited for systems with a large band gap, it is true, but HF
usually overestimates the gap (which is not particularly related to
transition states). Pure DFT methods underestimate band gaps, so if you
want to estimate band gaps accurately, you need a hybrid DFT functional
with some admixture of HF exchange, like the ubiquitous B3LYP.

 Quote: However, when you are breaking a bond you get two configurations with very similar energies, and in the full CI expansion you would have two or more determinants with comparable weights. To treat this kind of degenerate systems you need a multi reference approach. The extension of Hartree--Fock to include more configurations is called multi configuration self consistent field (MCSCF). The standard way of adding electron correlation to this is CASPT2 (complete active space 2:nd order perturbation theory). The complexity of these calculations grow quite quickly with the number of configurations, however, and you need some chemical intuition to set up the so-called "active space" properly. Using CASPT2 you pretty much always get reasonable results, but you cannot treat very big systems. Now, since DFT is in principle exact you might think that you wouldn't have to care about the multi reference character of bond breaking. However, standard Kohn-Sham DFT is also based on a reference determinant, and thus share the same problems as Hartree--Fock, although it can sometimes give unexpectedly good results. Multi reference DFT is an active research topic, but as far as I know there is no clear solution to this problem at the moment. If you have access to a research library you can find large amounts of material on this topic in the litterature.

One more thing to keep in mind - reaction rates are not always that
well known anyway, so keep that in mind when looking at CTST and its
brethren like RRKM theory.
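For reference, the simplest form of CTST boils down to the Eyring
equation, k = (kB*T/h) * exp(-dG_act/RT). A quick sketch with an
assumed, purely illustrative barrier of 20 kcal/mol:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
h   = 6.62607e-34    # Planck constant, J*s
R   = 1.98720e-3     # gas constant, kcal/(mol*K)

def eyring_rate(dG_act_kcal, T=298.15):
    """Conventional TST (Eyring) rate constant, unit transmission
    coefficient assumed; dG_act_kcal is the activation free energy."""
    return (k_B * T / h) * math.exp(-dG_act_kcal / (R * T))

# Illustrative 20 kcal/mol barrier at room temperature:
print(eyring_rate(20.0))   # ~1.4e-2 s^-1, a half-life of about a minute
```

The exponential is the whole story here: small errors in the computed
barrier translate into large errors in the rate, which is one reason
the experimental comparisons above are so uncertain.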

 Quote: Ulf
uekstrom@gmail.com
science forum beginner

Joined: 13 Feb 2006
Posts: 3

Posted: Sat Jul 08, 2006 2:36 pm    Post subject: Re: Theoretical Estimation of Chemical Reaction Rate Constants

Squark wrote:
 Quote: uekstrom@gmail.com wrote: It depends on your definition of large, but hundred atoms is no problem for an energy calculation, even on a single cpu. Probably some density functional method will give you best results unless you are breaking bonds, which requires a multi-reference approach. Can you elaborate what is a "multi-reference" approach?

Sure. The exact many-particle electronic wavefunction can be written as
an (infinite) sum of Slater determinants ("configurations") constructed
from an (infinite) set of one-particle wavefunctions (orbitals). In
reality you always truncate both the orbital space and the sum over
configurations somehow. With a given one-particle basis set you might
be able to do an exact solution, a full configuration interaction (Full
CI), for a small number of electrons.

The crudest approximation to this approach is then the Hartree-Fock
approximation, where you optimize the orbitals to give the best (lowest
energy) single-determinant approximation to the full wavefunction. This
usually works well in a system with a large bandgap, because in the
exact expansion this determinant would have a large weight (0.95, say).
On top of this you can then add electron correlation through
perturbation theory or other approaches.

However, when you are breaking a bond you get two configurations with
very similar energies, and in the full CI expansion you would have two
or more determinants with comparable weights. To treat this kind of
degenerate system you need a multi-reference approach. The extension of
Hartree-Fock to include more configurations is called
multi-configuration self-consistent field (MCSCF).
The standard way of adding electron correlation to this is CASPT2
(complete active space second-order perturbation theory). The
complexity of these calculations grows quite quickly with the number of
configurations, however, and you need some chemical intuition to set up
the so-called "active space" properly.
Using CASPT2 you pretty much always get reasonable results, but you
cannot treat very big systems.

Now, since DFT is in principle exact you might think that you wouldn't
have to care about the multi-reference character of bond breaking.
However, standard Kohn-Sham DFT is also based on a reference
determinant, and thus shares the same problems as Hartree-Fock,
although it can sometimes give unexpectedly good results.
Multi-reference DFT is an active research topic, but as far as I know
there is no clear solution to this problem at the moment. If you have
access to a research library you can find large amounts of material on
this topic in the literature.

Ulf
Squark
science forum beginner

Joined: 14 May 2005
Posts: 33

Posted: Sat Jul 08, 2006 2:36 pm    Post subject: Chemistry Books (was: Theoretical Estimation of Chemical Reaction Rate Constants)

Olli Lehtonen wrote:
 Quote: The book "Introduction to computational chemistry" by Jensen may be a good start.

I have noticed chemistry books fall roughly into three categories:

1) Books that discuss a lot of methodology, mostly computational
methods, and give few or no examples ("Introduction to computational
chemistry" is probably in this category).

2) Books that give a lot of examples but more handwaving than solid
physical arguments (usually books with "physical chemistry" in the
title).

3) Books that contain tables comparing results of various computational
methods with experiment, but no in-depth discussion of the origin of
the discrepancies and what is actually "going on" inside one or another
computational algorithm when applied to a particular example.

I would be glad if anyone directs me to a book that avoids the
shortcomings of each of these three types. That is, I am looking for a
book that contains lots of examples (I'm interested mostly in organic
chemistry), the results obtained upon applying each computational
method, and a discussion of why these results are the way they are,
including intermediate output of the method (e.g. wavefunctions) rather
than only the final answer (e.g. energy).

Best regards,
Squark
Olli Lehtonen
science forum beginner

Joined: 30 May 2006
Posts: 3

Posted: Mon Jul 03, 2006 12:29 am    Post subject: Re: Theoretical Estimation of Chemical Reaction Rate Constants

Squark wrote:

 Quote: I'm interested in A) Computational methods available to attack the problem, including methods with are viable for comparatively large molecules.

The book "Introduction to computational chemistry" by Jensen may be a
good start. There has probably been some discussion on reaction rates

regards,

Olli
Squark
science forum beginner

Joined: 14 May 2005
Posts: 33

Posted: Fri Jun 30, 2006 8:21 pm    Post subject: Re: Theoretical Estimation of Chemical Reaction Rate Constants

Arnold Neumaier wrote:
 Quote: Squark wrote: 2) The energy of a molecule in given conformation can be approximated using standard methods of molecular dynamics... This approach is not applicable to evaluation the "offshell" energy, i.e. it assumes the covalent bonds are not formed or broken (with rare exceptions such as allowing some bonds, e.g. disulfide, to break/form back). This is not true. The approach is completely general, although many force fields are specific to the motion of single molecules, in which your caveat applies. But one can use the same sort of approximations and constructions to model chemical reactions. Peptid docking studies use such an approach. The main preparatory work for a given class of reactions is the fit of the potential to a large enough data base.

Can you give me some reference employing this approach (to chemical
reactions)? In particular I'm interested in the description of a force
field considered suitable for such studies.

Best regards,
Squark
Squark
science forum beginner

Joined: 14 May 2005
Posts: 33

Posted: Fri Jun 30, 2006 8:21 pm    Post subject: Re: Theoretical Estimation of Chemical Reaction Rate Constants

uekstrom@gmail.com wrote:
 Quote: It depends on your definition of large, but hundred atoms is no problem for an energy calculation, even on a single cpu. Probably some density functional method will give you best results unless you are breaking bonds, which requires a multi-reference approach.

Can you elaborate what is a "multi-reference" approach?

 Quote: Bond breaking in general is difficult to handle, but may not be necessary to estimate the reaction rate constant. It may be the case that the transition state is well described with a closed-shell theory, and thus gives reasonable barrier height.

Can you give an example? By definition the transition state is
unstable (i.e. a critical point which is not a local minimum), so it's
hard to imagine describing it without bond breaking. Transition
intermediates can be stable (i.e. local minima) but their energy is
lower than the actual barrier.

 Quote: It would be hard to find a method which you can use both for large molecules and which is suitable for pencil and paper type calculations.

What I meant is two distinct methods, one for A and another for B.

 Quote: Why do you need to look at large systems? Can't you extract the part where the reaction is happening, and treat that more accurately? This is the idea behind the so called QM/MM or ONIOM methods, where you embed a quantum system in a classical MD environment.

Best regards,
Squark
uekstrom@gmail.com
science forum beginner

Joined: 13 Feb 2006
Posts: 3

Posted: Fri Jun 30, 2006 12:20 am    Post subject: Re: Theoretical Estimation of Chemical Reaction Rate Constants

Hi.

 Quote: I'm interested in the methods available for estimatating the rate constants of chemical reactions theoretically.

[..]

 Quote: The energy can be evaluated in any given point using computational methods such as Hartree-Fock with configuration interaction, Moller-Plesset perturbation theory etc. This is not realistic for large molecules, however.

It depends on your definition of large, but a hundred atoms is no
problem for an energy calculation, even on a single CPU. Probably some
density functional method will give you the best results unless you are
breaking bonds, which requires a multi-reference approach.

 Quote: 2) The energy of a molecule in given conformation can be approximated using standard methods of molecular dynamics,

[..]

 Quote: This approach is not applicable to evaluation the "offshell" energy, i.e. it assumes the covalent bonds are not formed or broken (with rare exceptions such as allowing some bonds, e.g. disulfide, to break/form back).

Bond breaking in general is difficult to handle, but may not be
necessary to estimate the reaction rate constant. It may be the case
that the transition state is well described with a closed-shell theory,
which thus gives a reasonable barrier height.

 Quote: 3) Given a potential energy surface, the rate of transitions between local minima can be estimated using transition state theory, the reactive flux method, Kramers theory etc. I'm interested in A) Computational methods available to attack the problem, including methods with are viable for comparatively large molecules.

Depending on the accuracy you need I would suggest either DFT (B3LYP or
a similar functional) or some semi-empirical method such as AM1, which
is much faster. For small molecules you can do multi-reference
calculations to treat bond breaking explicitly. You can find statistics
on the accuracy of reaction rates in the literature; note that most
methods are not very reliable, although you can usually compare two
different reaction paths. I do not have a lot of experience with
molecular dynamics potentials, but I am not very impressed with the
geometries predicted by some of the common force fields.

 Quote: B) Models that are as mathematically simple as possible that still yield reasonable results.

Something like Hückel theory, perhaps?
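Hückel theory really is about as simple as it gets while staying
quantum mechanical: the only input is the pi-system connectivity, with
alpha on the diagonal and beta between bonded carbons. A sketch for
1,3-butadiene (alpha set to 0, so energies come out in units of |beta|):

```python
import numpy as np

# Huckel pi Hamiltonian for 1,3-butadiene: carbons bonded in a chain
# 1-2-3-4; alpha = 0 on the diagonal, beta = -1 between bonded atoms.
beta = -1.0
H = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    H[i, j] = H[j, i] = beta

energies = np.sort(np.linalg.eigvalsh(H))
pi_energy = 2 * energies[0] + 2 * energies[1]   # 4 pi electrons fill two MOs

print(energies)    # approx [-1.618, -0.618, 0.618, 1.618] (units of |beta|)
print(pi_energy)   # approx -4.472; two isolated ethylenes give -4, so the
                   # delocalization energy is about 0.472 |beta|
```

Of course this says nothing quantitative about barriers, but it is the
kind of model you can actually work through on paper.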

 Quote: In principle, 1+3 provides a method but only for small molecules (so it is not a sufficient answer for A) and is very hard to handle analytically ("pencil on paper") or to use for building an intuitive understanding (so it is a poor answer for B).

It would be hard to find a method which you can use both for large
molecules and which is suitable for pencil and paper type calculations.
Why do you need to look at large systems? Can't you extract the part
where the reaction is happening, and treat that more accurately? This is
the idea behind the so-called QM/MM or ONIOM methods, where you embed a
quantum system in a classical MD environment. I would find it hard to
build an intuitive understanding for something that contains hundreds of
atoms.
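The bookkeeping behind a two-layer ONIOM energy is a simple
extrapolation: E = E_high(model) + E_low(real) - E_low(model), where
"model" is the small reaction-center region and "real" is the whole
system. The numbers below are placeholders, not real calculations:

```python
def oniom2(e_high_model, e_low_real, e_low_model):
    """Two-layer ONIOM extrapolation: treat the small 'model' region at
    the high level, and correct with the low-level description of the
    full ('real') system minus its low-level model-region part."""
    return e_high_model + e_low_real - e_low_model

# Hypothetical energies in hartree (illustrative placeholders only):
e = oniom2(e_high_model=-154.70, e_low_real=-1203.40, e_low_model=-154.10)
print(e)   # -1204.00
```

The subtraction cancels the double-counted low-level treatment of the
reaction center, which is why the scheme is cheap: only the small
region ever sees the expensive method.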

Ulf Ekström
Eugene Stefanovich
science forum Guru

Joined: 24 Mar 2005
Posts: 519

Posted: Fri Jun 30, 2006 12:20 am    Post subject: Re: Theoretical Estimation of Chemical Reaction Rate Constants

Squark wrote:

 Quote: I'm interested in the methods available for estimatating the rate constants of chemical reactions theoretically. [...] I'm interested in A) Computational methods available to attack the problem, including methods with are viable for comparatively large molecules. B) Models that are as mathematically simple as possible that still yield reasonable results.

Evaluating rate constants is a tough business. You need to use
high-level ab initio methods to get a decent approximation for the
potential energy surface. Even with the best computers one is limited
to molecules with about a dozen atoms. In recent years, more powerful
density functional methods have become promising, so a few dozen atoms
may be feasible. The last time I was exposed to this kind of activity
was 8 years ago, but I don't think the situation has changed
significantly.

Of course, there are also semiempirical and force field approaches
which can handle hundreds of atoms.
However, as you correctly point out, chemical reactions involve bond
breaking and bond making, but semiempirical parameters are usually
adjusted for stable compounds. So, their predictive power is
questionable.

There is a lot of literature about these subjects. Just browse the
J. Chem. Phys.

Eugene.
lucasea@sbcglobal.net
science forum beginner

Joined: 30 Jun 2006
Posts: 8

Posted: Fri Jun 30, 2006 12:20 am    Post subject: Re: Theoretical Estimation of Chemical Reaction Rate Constants

The following is my practical perspective as an organic/inorganic chemist
who frequently uses a variety of computational methods in his work, not from
a person specifically trained as a computational chemist. Take it with the
grain of salt it deserves.

The answer to your question partly depends on what you mean by
"comparatively large". To some, anything bigger than H2 is comparatively
large; to others, a 100-residue polypeptide is small.

In any case, you're going to need a method that is capable of handling the
electrons in at least part of the molecule. As you note,
mechanics/dynamics-only methods are not especially good at handling
molecular energy changes during bond breaking and forming. Fundamentally,
there is no reason they couldn't be, but there are practical barriers.
First, mechanics/dynamics calculates the energy of a particular X-Y bond as
a harmonic oscillator. This is a decent approximation near the equilibrium
bond length, but as bonds break, this approximation obviously breaks down.
Thus, one would need a fundamentally new bond potential energy function to
handle bond breaking, and this would likely make the calculation
significantly more difficult. Maybe a hybrid Lennard-Jones / harmonic
oscillator potential would work. More importantly, however, the influence
of neighboring groups on the transition state is too varied, and you'd have
to have a different bond potential function for each pair of atoms X and Y
in each conceivable substitution pattern, with substituents several atoms
away potentially making a difference. For example, p-nitrobenzyl chloride
will have a very different C-Cl bond potential than p-hydroxybenzyl
chloride. This is a conjugation effect, but what I say also applies to
through-space interactions. So, for example, tris(cyanoethyl)phosphine is
an exceptionally poor base, due to the through-space inductive effect of the
cyano groups. Any calculation of a nucleophilic reaction of that phosphine
would have to somehow account for that inductive effect.
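The point about the harmonic approximation can be made quantitative by
comparing a harmonic potential to a Morse potential with the same
curvature at equilibrium; the Morse curve dissociates, the harmonic one
does not. The parameters below are loosely H2-like but purely
illustrative:

```python
import math

# Illustrative Morse parameters (roughly H2-like): well depth D_e in eV,
# width a in 1/Angstrom, equilibrium bond length r_e in Angstrom.
D_e, a, r_e = 4.75, 1.94, 0.74

def morse(r):
    return D_e * (1.0 - math.exp(-a * (r - r_e)))**2

def harmonic(r):
    # Harmonic potential with the same curvature at r_e: k = 2 * D_e * a^2
    k = 2.0 * D_e * a * a
    return 0.5 * k * (r - r_e)**2

for r in (0.74, 0.8, 1.5, 3.0):
    print(r, round(morse(r), 2), round(harmonic(r), 2))
# Near r_e the two agree; at stretched geometries the Morse energy
# plateaus near D_e while the harmonic energy climbs without bound.
```

A mechanics force field built on the harmonic form therefore cannot
even describe the dissociation limit, let alone the substituent effects
discussed above.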

This means that the methods currently available are in your category 1.
Several computational packages are available, at a wide range of levels of
theory, that can find a saddle point between two local minima on a potential
energy surface (definition of a transition state.) As I understand it (I've
never used them), the two you mention, H-F and MP, are very high levels of
theory and very accurate, but computationally very intensive. Depending on
the size of your molecule, and unless you have a supercomputer, they are not
especially useful for transition state calculations.

However, there are several semi-empirical methods that are much better
suited for this type of calculation. Two of the more popular are Jimmy
Stewart's MOPAC, and Mike Zerner's ZINDO, partly because of their early and
ongoing inclusion in inexpensive computational packages like CAChe. I'm
sure there are others, but I'm not familiar with them. In the past 10 years
or so, density functional theory (DFT) has emerged as a standard. It is
generally much more accurate than the semi-empirical methods, but is
computationally significantly more intensive.

The methods I've listed above are only practically useful for calculations
on relatively small molecules. On a fast modern PC, I've done TS
calculations very conveniently (< 1 hour) on systems having up to a hundred
or so atoms using MOPAC. With DFT, similar calculations require at least
overnight, if not multiple days. Using a computer that's a dedicated
number-crunching box (multiple optimized processors) or a supercomputer
speeds this up, but gets very expensive quickly.

One concept that was being developed in the late 90s, and I assume still is,
although I have to admit I'm a little out of touch with the literature in
the area, is the idea of hybrid methods. The idea is that parts of the
molecule remote from the "action" have little electronic effect on a
reaction, and at best only influence the reaction site by affecting the
geometry of things nearer the action. In that case, people have used
semi-empirical and DFT methods to model things close to the site of the
action (i.e., things that could have an electronic effect on the site of the
reaction), while using much faster mechanics/dynamics methods to model the
rest of the molecule. I think this idea first came out of polymer
chemistry, particularly proteins, so if this is what you mean by
"comparatively large", then this may be your best bet. I don't know if
there are any commercial packages available, so you may be stuck writing
code if you want to do this. The last time I had a chance to use Biosym's
polymer modelling software, they were working on this, but I have no idea
what has happened since Accelrys acquired MSI/Biosym several years ago.

A note is in order about quality of the results. While David Dixon was at
Dupont, he would go around to Gordon Conferences and talk about his goal
(somewhat hyperbolically, I suspect) to take chemistry out of the flask and
put it in a computer. When pressed, even he admitted this was a pipe dream,
but it is easy to fall into the trap of thinking this is possible, if you
don't have an idea of the limitations of the techniques. I don't have
numbers at hand, but I generally don't trust computational methods for
absolute energies to better than +/- 5 kcal/mol. DFT may be good to +/- 1
kcal/mol, I'm not sure. Considering that a DDG* (that's
delta-delta-G-double-dagger) of only about 0.4 kcal/mol will double the
rate of a reaction at room temperature, this means that absolute
reaction rates from computational chemistry are nearly useless.
Computational chemistry is most useful in a
comparative sense, since relative computational energies are more accurate
than absolute ones. For example, I would be comfortable predicting relative
reaction rates, to within an order of magnitude, using computational
chemistry...but I wouldn't design a plant based on it (and I would even be
very cautious the first time I tried the reaction in a lab.)
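That sensitivity is easy to check: two TST rates whose barriers differ
by ddG differ by a factor exp(ddG/RT), since the prefactors cancel. At
room temperature RT*ln(2) is about 0.41 kcal/mol (doubles a rate) and
RT*ln(10) is about 1.36 kcal/mol (one order of magnitude):

```python
import math

R = 1.98720e-3    # gas constant, kcal/(mol*K)
T = 298.15        # room temperature, K

def rate_ratio(ddG_kcal):
    """Ratio of two transition-state-theory rates whose free-energy
    barriers differ by ddG_kcal; the kB*T/h prefactors cancel."""
    return math.exp(ddG_kcal / (R * T))

print(rate_ratio(0.41))    # ~2: doubles the rate
print(rate_ratio(1.364))   # ~10: one order of magnitude
print(rate_ratio(5.0))     # ~4600: why a +/- 5 kcal/mol barrier error
                           # makes absolute rates nearly useless
```

This is exactly why relative rates (same systematic errors on both
sides of the ratio) are so much more trustworthy than absolute ones.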

Hope this helps.

Eric Lucas

"Squark" <top.squark@gmail.com> wrote in message
 Quote: Hello everyone. I'm interested in the methods available for estimatating the rate constants of chemical reactions theoretically. I'm familiar with the following: 1) In principle the Bohr-Oppenheimer approximation yields the potential energy of a cluster of atoms as a function of nucleus locations. The energy can be evaluated in any given point using computational methods such as Hartree-Fock with configuration interaction, Moller-Plesset perturbation theory etc. This is not realistic for large molecules, however. 2) The energy of a molecule in given conformation can be approximated using standard methods of molecular dynamics, which means a harmonic oscillator approximation for dependence on bond lengths and angles or treating some/all of those as constrained to their optimal values, a similar simplistic approach to dependence of dihedral angles + electrostatic, hydrogen bond and van der Waals terms. This approach is not applicable to evaluation the "offshell" energy, i.e. it assumes the covalent bonds are not formed or broken (with rare exceptions such as allowing some bonds, e.g. disulfide, to break/form back). 3) Given a potential energy surface, the rate of transitions between local minima can be estimated using transition state theory, the reactive flux method, Kramers theory etc. I'm interested in A) Computational methods available to attack the problem, including methods with are viable for comparatively large molecules. B) Models that are as mathematically simple as possible that still yield reasonable results. In principle, 1+3 provides a method but only for small molecules (so it is not a sufficient answer for A) and is very hard to handle analytically ("pencil on paper") or to use for building an intuitive understanding (so it is a poor answer for B). Thx a lot for any help! Best regards, Squark
Arnold Neumaier
science forum Guru

Joined: 24 Mar 2005
Posts: 379

Posted: Fri Jun 30, 2006 12:20 am    Post subject: Re: Theoretical Estimation of Chemical Reaction Rate Constants

Squark wrote:
 Quote: Hello everyone. I'm interested in the methods available for estimatating the rate constants of chemical reactions theoretically. I'm familiar with the following: 1) In principle the Bohr-Oppenheimer approximation yields the potential energy of a cluster of atoms as a function of nucleus locations. The energy can be evaluated in any given point using computational methods such as Hartree-Fock with configuration interaction, Moller-Plesset perturbation theory etc. This is not realistic for large molecules, however. 2) The energy of a molecule in given conformation can be approximated using standard methods of molecular dynamics, which means a harmonic oscillator approximation for dependence on bond lengths and angles or treating some/all of those as constrained to their optimal values, a similar simplistic approach to dependence of dihedral angles + electrostatic, hydrogen bond and van der Waals terms. This approach is not applicable to evaluation the "offshell" energy, i.e. it assumes the covalent bonds are not formed or broken (with rare exceptions such as allowing some bonds, e.g. disulfide, to break/form back).

This is not true. The approach is completely general, although many
force fields are specific to the motion of single molecules, in which
case your caveat applies. But one can use the same sort of
approximations and constructions to model chemical reactions. Peptide
docking studies use such an approach. The main preparatory work for a
given class of reactions is the fit of the potential to a large enough
database.

 Quote: 3) Given a potential energy surface, the rate of transitions between local minima can be estimated using transition state theory, the reactive flux method, Kramers theory etc. I'm interested in A) Computational methods available to attack the problem, including methods with are viable for comparatively large molecules. B) Models that are as mathematically simple as possible that still yield reasonable results. In principle, 1+3 provides a method but only for small molecules (so it is not a sufficient answer for A) and is very hard to handle analytically ("pencil on paper") or to use for building an intuitive understanding (so it is a poor answer for B).

There is no way to calculate nontrivial reaction rates by pencil and
paper. All methods in use rely heavily on numerical approaches.
No more than five or six quantum degrees of freedom can be handled
(and only one or two easily), so the quantum treatment is usually
restricted to the conspicuous degrees of freedom at the immediate
reaction site.
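For orientation, the simplest such numerical estimate is conventional transition state theory via the Eyring equation, k = (kB*T/h) exp(-dG‡/RT). A minimal sketch, assuming a hypothetical activation free energy (the 80 kJ/mol below is made up):

```python
import math

def eyring_rate(delta_g_kjmol, T=298.15):
    """Conventional TST rate constant via the Eyring equation:
    k = (kB*T/h) * exp(-dG_activation / (R*T))."""
    kB = 1.380649e-23   # Boltzmann constant, J/K
    h = 6.62607015e-34  # Planck constant, J*s
    R = 8.314462618     # gas constant, J/(mol*K)
    return (kB * T / h) * math.exp(-delta_g_kjmol * 1e3 / (R * T))

# Hypothetical activation free energy of 80 kJ/mol at room temperature:
k = eyring_rate(80.0)   # rate in 1/s
```

The hard part, of course, is obtaining dG‡ in the first place; that is where all the quantum-chemical and force-field machinery discussed in this thread comes in.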

From a numerical point of view, Kramers' theory is restricted to
reactions which have a fairly definite reaction path. In that case
one can treat the reaction path coordinate quantum-mechanically
and the remainder classically. But large molecules rarely have a
well-defined reaction path...
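Once a one-dimensional reaction path with a known barrier is in hand, the Kramers rate in the strong-friction (overdamped) limit is trivial to evaluate. A sketch in reduced units with made-up parameters, shown only for the shape of the formula:

```python
import math

def kramers_overdamped(omega_min, omega_barrier, gamma, barrier, kT):
    """Kramers escape rate in the strong-friction (overdamped) limit:
    k = (omega_min * omega_barrier) / (2*pi*gamma) * exp(-barrier / kT),
    where the omegas are the curvature frequencies at the well minimum
    and at the barrier top, and gamma is the friction coefficient."""
    return (omega_min * omega_barrier / (2.0 * math.pi * gamma)
            * math.exp(-barrier / kT))

# Reduced-unit parameters, chosen only to illustrate the formula:
k = kramers_overdamped(omega_min=1.0, omega_barrier=1.0,
                       gamma=10.0, barrier=5.0, kT=1.0)
```

Note the prefactor is inversely proportional to the friction: strong coupling to the bath slows the crossing, which is the characteristic overdamped Kramers behavior.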

For large molecules with more than a few dozen atoms, one needs
very simplified models (so-called force fields, or empirical
potential energy surfaces). One then runs classical trajectories
with initial conditions drawn from the appropriate Boltzmann
distribution, and counts the proportion of trajectories which end in
the reaction product.

This is your case 2); it also has the advantage of being closest
to chemical intuition and hence understandable in some sense.
Why doesn't it satisfy your criteria A) and B)?
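The trajectory-counting procedure described above can be sketched on a toy one-dimensional double well with overdamped Langevin dynamics. All parameters are illustrative, and for brevity the initial conditions are taken at the reactant minimum rather than properly Boltzmann-sampled:

```python
import math
import random

def run_trajectory(x0, steps=10000, dt=1e-3, gamma=1.0, kT=0.25, seed=None):
    """Overdamped Langevin dynamics on the double well V(x) = (x**2 - 1)**2,
    with the reactant well at x = -1 and the product well at x = +1."""
    rng = random.Random(seed)
    x = x0
    noise_amp = math.sqrt(2.0 * kT * dt / gamma)
    for _ in range(steps):
        force = -4.0 * x * (x * x - 1.0)      # -dV/dx
        x += force * dt / gamma + noise_amp * rng.gauss(0.0, 1.0)
    return x

def product_fraction(n=100):
    """Launch n trajectories from the reactant minimum and count the
    fraction ending on the product side of the barrier (x > 0)."""
    finals = [run_trajectory(-1.0, seed=i) for i in range(n)]
    return sum(1 for x in finals if x > 0.0) / n
```

Replacing the toy potential with a fitted force field and the single coordinate with the full set of atomic coordinates gives the scheme used in practice; the fraction of reactive trajectories per unit time then yields the rate constant.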

There are many force fields in current use, depending on the class
of molecules you want to study.

Arnold Neumaier
Squark
science forum beginner

Joined: 14 May 2005
Posts: 33

 Posted: Thu Jun 29, 2006 3:57 am    Post subject: Theoretical Estimation of Chemical Reaction Rate Constants Hello everyone. I'm interested in the methods available for estimating the rate constants of chemical reactions theoretically. I'm familiar with the following: 1) In principle the Born-Oppenheimer approximation yields the potential energy of a cluster of atoms as a function of the nuclear locations. The energy can be evaluated at any given point using computational methods such as Hartree-Fock with configuration interaction, Moller-Plesset perturbation theory etc. This is not realistic for large molecules, however. 2) The energy of a molecule in a given conformation can be approximated using standard methods of molecular dynamics, which means a harmonic oscillator approximation for the dependence on bond lengths and angles (or treating some/all of those as constrained to their optimal values), a similarly simplistic approach to the dependence on dihedral angles, plus electrostatic, hydrogen bond and van der Waals terms. This approach is not applicable to evaluating the "off-shell" energy, i.e. it assumes the covalent bonds are not formed or broken (with rare exceptions such as allowing some bonds, e.g. disulfide, to break/re-form). 3) Given a potential energy surface, the rate of transitions between local minima can be estimated using transition state theory, the reactive flux method, Kramers theory etc. I'm interested in A) Computational methods available to attack the problem, including methods which are viable for comparatively large molecules. B) Models that are as mathematically simple as possible but still yield reasonable results. In principle, 1+3 provides a method, but only for small molecules (so it is not a sufficient answer for A), and it is very hard to handle analytically ("pencil and paper") or to use for building an intuitive understanding (so it is a poor answer for B). Thx a lot for any help! Best regards, Squark
