Predicting Melting Points.

One feature I have wanted to add to ChemSi is the ability to have state symbols - that is, instead of outputting:

C3H8 + 5O2 -> 3CO2 + 4H2O

it would instead output:

C3H8(g) + 5O2(g) -> 3CO2(g) + 4H2O(l)

In order to do this I needed some way to predict the melting and (eventually) boiling points of different compounds, given only a little information about them.

My initial solution was simply to use enthalpy and entropy changes with the Gibbs equation. At the melting point the Gibbs free energy change is zero, so if I could get the entropy change from solid to liquid, and the corresponding enthalpy change, then I could use

T_{m} = \frac{\Delta H_{fus}}{\Delta S_{fus}}

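As an illustration - a minimal sketch, not ChemSi's actual code - using ammonia's enthalpy and entropy of fusion from the NIST Chemistry WebBook (quoted in the entropy post further down this page):

# Minimal sketch of the Gibbs approach - not ChemSi's actual code.
# At the melting point, dG = dH - T*dS = 0, so T_m = dH_fus / dS_fus.

def melting_point(dH_fus, dS_fus):
    """Melting temperature in K, given dH_fus in J/mol and dS_fus in J/(K mol)."""
    return dH_fus / dS_fus

# Ammonia, with fusion data from the NIST Chemistry WebBook:
print(melting_point(5653, 28.93))  # ~195.4 K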
This solution works - to a point. For chemicals where I have the entropy and enthalpy data it works very nicely, but I do not have the data for all states of all compounds - and it is unreasonable to collect it. Additionally, this felt like a bit of a cop out. The point of ChemSi is to try and calculate (or at least approximate) these values from easily accessible theoretical data, as is done with the reaction prediction. If I wanted it to predict them excellently I could just provide it with some rules (e.g. a compound containing C, H, and O reacts with O2 to produce such-and-such) or, even worse, just give it a list of the reactions.

After some Googling, I came across the Lindemann criterion. Essentially it boils down to

T_{m} = \frac{2\pi mc^{2}a^{2}\theta^{2}_{D}k_{B}}{h^{2}}

Where the 'dodgy constants' are \theta_{D} (the Debye temperature) and c. The rest are either pretty standard constants (the Boltzmann constant, for example) or pretty easy to get. The reason the dodgy constants were a problem is that they require information on the geometry of the crystalline material - which was quite difficult to get access to.

Just a note on that crystalline material part - almost all of these prediction methods (bar the enthalpy and entropy change one) are limited to ionic crystals, and are entirely useless for other molecules.

Following this, I searched around some more. I had the idea that perhaps the Lattice Enthalpy would have some relationship with the melting temperature - after all, surely a lattice with a more negative lattice enthalpy would be more strongly bonded so more energy would be needed to break it apart? I decided to test this hypothesis.

For a sample of ionic lattices - the group 1 metals Na-Cs paired with the halogens F-I - I was able to produce the following graph. (Only using the group 1 metals and halogens may be a bit of a limitation, which led to me having to do some fiddling later on to fit the other groups; at this stage, however, I was looking for a relationship. Lithium is not included as bonds with lithium tend to be more covalent in character.) The data was taken from this website - I know it refers to Lattice Energy, but from the explanation alongside it I believe they are referring to the lattice dissociation enthalpy, so it is still usable in this context.



As you can see, this produces a reasonable correlation - the pairwise correlation coefficient calculated by Octave is 0.9300.

I decided to go ahead and use this lattice enthalpy method to predict the melting points. Unfortunately, as you may have expected, I ran into the same issue as with the Gibbs method - I can't really store enough lattice enthalpies! Luckily, there is the Born-Landé equation:

E = -\frac{N_{A}Mz^{+}z^{-}e^{2}}{4\pi \varepsilon_{0}r_{0}}\left(1-\frac{1}{n}\right)

This does not produce exact lattice enthalpies - it is more of an estimate. This method is used in predict_mp_alg2().
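To illustrate, here is a minimal sketch of the Born-Landé calculation - not the actual predict_mp_alg2() code; the Madelung constant, interionic distance and Born exponent used for NaCl are standard textbook values:

import math

N_A = 6.022e23     # Avogadro constant, mol^-1
e = 1.602e-19      # elementary charge, C
eps_0 = 8.854e-12  # vacuum permittivity, F m^-1

def born_lande(M, z_plus, z_minus, r_0, n):
    """Born-Lande lattice energy in J/mol (negative for a bound lattice).

    M is the Madelung constant, z_plus/z_minus the charge magnitudes,
    r_0 the interionic distance in metres, n the Born exponent."""
    return (-N_A * M * z_plus * z_minus * e**2
            / (4 * math.pi * eps_0 * r_0) * (1 - 1.0 / n))

# NaCl: Madelung constant 1.7476, r_0 = 282 pm, Born exponent n = 8
print(born_lande(1.7476, 1, 1, 282e-12, 8) / 1000)  # ~ -753 kJ/mol, close to the experimental value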

I also decided to see whether I could modify the shell energy equation already used for reaction prediction to approximate the lattice enthalpy. I won't go into detail about it here, but it is given as predict_mp_alg1() in ChemSi if you want to check it out.

Just to show these equations working, I decided to predict the melting points of three compounds using both the lattice enthalpies (Born-Landé) and my shell energies. The compounds I chose were NaCl, FeCl2, and CsI.

Compound | Actual MP (K) | Shell Energy MP (K) | Born-Landé MP (K)
NaCl     | 1074          | 1088                | 1122
FeCl2    | 950           | 993                 | 895
CsI      | 894           | 1084                | 819

As you can see, both give a reasonable (±50 K) approximation for NaCl, with the shell energy giving the better prediction. This is mirrored with FeCl2, where the shell energy is again closest (though both predictions are spread further out than for NaCl). For CsI the Born-Landé approximation is closest, with the shell energy prediction too far out - likely because the shell energy calculation gets less accurate at higher atomic numbers.

Both algorithms have been left in ChemSi so the differences can be seen.

All the Light We Cannot See.


Price (as of writing): £6.29

Publisher Synopsis: 

A beautiful, stunningly ambitious novel about a blind French girl and a German boy whose paths collide in occupied France as both try to survive the devastation of World War II

‘Open your eyes and see what you can with them before they close forever.’

For Marie-Laure, blind since the age of six, the world is full of mazes. The miniature of a Paris neighbourhood, made by her father to teach her the way home. The microscopic layers within the invaluable diamond that her father guards in the Museum of Natural History. The walled city by the sea, where father and daughter take refuge when the Nazis invade Paris. And a future which draws her ever closer to Werner, a German orphan, destined to labour in the mines until a broken radio fills his life with possibility and brings him to the notice of the Hitler Youth.

In this magnificent, deeply moving novel, the stories of Marie-Laure and Werner illuminate the ways, against all odds, people try to be good to one another.


This is a hauntingly beautiful book, juxtaposing the horrendous nature of war with the kind, caring natures of both Marie-Laure and Werner. It also shows how different people do not necessarily act as would be expected in their situations. I initially bought this book without reading any synopses or reviews, to avoid even the minutest spoiler, and I was blown away by what I found. I am going to read more of Anthony Doerr's books, and I would wholeheartedly recommend this one.

The Shark and the Albatross.

Price (as of writing): £8.99

Publisher Synopsis: 

For twenty years John Aitchison has been travelling the world to film wildlife for the BBC and other broadcasters, taking him to far-away places on every continent. The Shark and the Albatross is the story of these journeys of discovery, of his encounters with animals and occasional enterprising individuals in remote and sometimes dangerous places. His destinations include the far north and the far south, expeditions to film for Frozen Planet and other natural history series, in Svalbard, Alaska, the remote Atlantic island of South Georgia, and the Antarctic. They also encompass wild places in India, China and the United States. In all he finds and describes key moments in the lives of animals, among them polar bears and penguins, seals and whales, sharks and birds, and wolves and lynxes.

He reveals what happens behind the scenes and beyond the camera. He explains the practicalities and challenges of the filming process, and the problems of survival in perilous places. He records touching moments and dramatic incidents, some ending in success, others desperately sad. There are times when a hunted animal triumphs against the odds, and others when, in spite of preparation for every outcome, disaster strikes. And, as the author shows in several incidents that combine nail-biting tension with hair-raising hilarity, disaster can strike for film-makers too.

This is natural history writing at its absolute best, evocative, informative and gripping from first to last.


I read this book following on from The Great Soul of Siberia, and it is in a similar vein. Both books take you on journeys to places on the globe you may not normally be able to visit, and through extraordinary writing describe in great detail what is found there. The Shark and the Albatross is a lot more varied, in that it takes you further afield (through India, the Arctic, America and so many other places), but it covers each visit in less detail. You also don't get as deep a story with each of them, but more a snapshot - which is surprisingly apt for a book about wildlife filming. I enjoyed the book, but I did find that separating the different locations into different chapters meant it didn't follow on as easily - I found myself having to return to a previous chapter to reread it. Regardless, it was a very beautiful book to read and I would greatly recommend it.


Recently I was reading the excellent Serengeti Rules by Sean B. Carroll, and then this happened.

A few months ago I wrote the population simulator described here, but it purely dealt with so-called 'bottom up' regulation - the primary limiting factors on the population were the amount of food and an artificial carrying capacity. Reading The Serengeti Rules got me thinking about 'top down' regulation - i.e. big fish eats little fish, so there are fewer little fish to eat seaweed and other fishy things. I felt that the artificial carrying capacity was just that - artificial - so I set about writing a new simulator, which attempts to create a very simple (really simple) model of an ecosystem (more a food web).

Essentially, the model has three numbers - the population of a producer, such as a plant; the population of a primary consumer (the prey), such as a rabbit; and the population of a secondary consumer (the predator), such as a fox. Each 'cycle', each of these changes. Depending on which model I use, the producer can either increase linearly (which allows both consumers to increase rapidly), be constant (in which case I get a realistic carrying capacity), or change based on a wave pattern (which sort of simulates seasons). Then, using the ratio of prey to producers and a random number generator, it is decided whether each prey individual reproduces (producer -2, prey +1), stays alive (producer -1, prey +0), or dies (producer +0, prey -1). A similar procedure is applied to the predators (the secondary consumers).
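For example, here is a minimal sketch of the prey update under this scheme - not the simulator's actual code, and the exact probability weighting is an illustrative assumption:

import random

def update_prey(producers, prey):
    """One cycle of prey updates: each prey reproduces, survives, or starves."""
    survivors = 0
    for _ in range(prey):
        # More food per prey means a better chance of reproducing
        food_ratio = producers / float(max(prey, 1))
        if random.random() < min(food_ratio / 4, 0.5) and producers >= 2:
            producers -= 2    # reproduce: producer -2, prey +1 net
            survivors += 2
        elif producers >= 1:
            producers -= 1    # stay alive: producer -1, prey +0
            survivors += 1
        # otherwise: starve (producer +0, prey -1) - not carried over
    return producers, survivors

# Constant-food model: top up the producers by a fixed amount each cycle
producers, prey = 500, 50
for cycle in range(100):
    producers, prey = update_prey(producers + 100, prey)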

The output of this program being run under different situations is shown below.

Sim 1; No predators are present, and there is a constant amount of food. The prey population (green) increases and then is relatively stable.


Sim 2; Same as Sim 1, only predators are introduced. While in Sim 1 there was only bottom up regulation, in this one there is top down regulation too - the presence of the predators keeps the prey population at a lower level.
Sim 3; Limited amount of food, small population of prey. No predators. Prey consume all the food, and starve themselves to death.
Sim 4; Prey, predators and large amount of initial food. Small growth rate of food. Initial large amount of food supports a large population of prey, but eventually much food is used up so the prey population slumps. This means there is less pressure on the food so it returns to the original levels.


Sim 5; Initial high concentration of food, so prey peak. Food is regulated by seasons and so when there is lots of food available, more prey are alive so the predator population increases.

This program can be found on my GitHub. It is probably the least taxing of all the simulations I have written so far.

The Great Soul of Siberia.

Price (as of writing): £8.99

Publisher Synopsis: 

There are five races of tiger on our planet and all but one live in tropical regions: the Siberian Tiger Panthera tigris altaica is the exception. Mysterious and elusive, and with only 350 remaining in the wild, the Siberian tiger remains a complete enigma. One man has set out to change this.

Sooyong Park has spent twenty years tracking and observing these elusive tigers. Each year he spends six months braving sub-zero temperatures, buried in grave-like underground bunkers, fearlessly immersing himself in the lives of Siberian tigers. As he watches the brutal, day-to-day struggle to survive the harsh landscape, threatened by poachers and the disappearance of the pristine habitat, Park becomes emotionally and spiritually attached to these beautiful and deadly predators. No one has ever been this close: as he comes face-to-face with one tiger, Bloody Mary, her fierce determination to protect her cubs nearly results in his own bloody demise.

Poignant, poetic and fiercely compassionate, The Great Soul of Siberia is the incredible story of Park’s unique obsession with these compelling creatures on the very brink of extinction, and his dangerous quest to seek them out to observe and study them. Eloquently told in Park’s distinctive voice, it is a personal account of one of the most extraordinary wildlife studies ever undertaken.


This is a beautiful book. Park's writing made it feel as though I was actually there, and makes it all the more real. Despite essentially telling the end of 'the story' at the beginning (which in some books could spoil it, but in this one only makes it feel even more special), it is a gripping read, and without ruining the ending it really provides an insight into the lives of these tigers and the dangers and troubles they face. Through some extremely detailed prose Park takes you on a journey through the lives of these tigers, gripping you at every corner. I would strongly recommend this book to anyone, but be careful when reading the last few chapters!

Inheritance/Population Simulation.

Over the past few weeks I have been considering writing a sort of population simulator - i.e. you have a sample of, say, 100 (very basic) individuals and you let it run. They have some genes (currently the code has two, but it could be expanded) and these genes affect how long they live and when they become sexually mature - basically, how likely they are to pass on their genetics. It would therefore show a sort of selection, and because of this would also show a population maximum.

This weekend I decided to finally start writing it, and so far it is going well. It currently has a hard-set population maximum (5,000, though it can be changed in the code); however, I am planning to change this so there is a set amount of food in the environment. As individuals grow they take up more food depending on their gene code, and when they die this food (minus some which cannot be released) is returned to the environment. If the food runs out, some organisms may have an ability to survive without it, but others won't - so they will die out and the survivors will have more food.

At the moment, it creates 100 individuals with two random genes - x and y - which determine their survival characteristics. Currently it is based around the following equations;

age_{max} = 2000\left|\sin\left(\frac{x}{10}\right) \cdot \cos\left(\frac{y}{10}\right)\right|

age_{spread} = 1000\left|\sin\left(\frac{y}{10}\right) \cdot \cos\left(\frac{x}{10}\right)\right|

This means that as the x and y values change, the maximum age and spread change - so certain configurations will be more stable than others. The probability of reproduction is modelled as a normal distribution using;

\frac{1}{age_{spread}\cdot \sqrt{2\pi}} e^{-\frac{(age - \frac{age_{max}}{2})^{2}}{2\cdot age_{spread}^{2}}}

This gives a probability of reproduction, so the survival is sort of down to chance. If an individual reproduces then their genes change as they are passed on (only slightly) so the most successful genes are likely to survive. At the moment (without the food construct) individuals die at their maximum age multiplied by a factor which is dependent on the number of individuals in the population;

age_{relmax} = age_{max} \cdot \left(1-\frac{pop^{3}}{pop_{max}^{3}}\right)

So, as the number of individuals increases, the age at which individuals die gets smaller - hence in the graphs we see a boom-bust population change: when the population gets too big individuals die, and when it gets too low they live longer.
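Putting those formulae together, a minimal sketch in Python - not the simulator itself; pop_max is the 5,000 cap mentioned above:

import math

POP_MAX = 5000  # the hard-set population maximum mentioned above

def age_max(x, y):
    return 2000 * abs(math.sin(x / 10.0) * math.cos(y / 10.0))

def age_spread(x, y):
    return 1000 * abs(math.sin(y / 10.0) * math.cos(x / 10.0))

def p_reproduce(age, a_max, a_spread):
    """Probability of reproducing at a given age (normal distribution)."""
    return (math.exp(-(age - a_max / 2) ** 2 / (2 * a_spread ** 2))
            / (a_spread * math.sqrt(2 * math.pi)))

def age_relmax(a_max, pop):
    """Effective maximum age, shrinking as the population nears the cap."""
    return a_max * (1 - pop ** 3 / float(POP_MAX) ** 3)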

Hopefully this new project will continue to grow. For now, the source is on github, and an image of the program is shown below.


The population reaches a stable position which is just below the max capacity.

Christmas and New Years!

I just wanted to take this time to wish everyone a very (late) Merry Christmas, and a Happy New Year +6! I hope everything over the last year went well, and that the year ahead will be even better!

ChemKit is currently on hiatus as I have mock exams coming up and other work to do (dreaded coursework soon - the problem of doing science A Levels!). I am also thinking of writing some sort of population simulator to aid with biology. Think of it as having a population of, say, 100 organisms with different genetic fingerprints (say 20 digits long). Each digit has a specific advantage or disadvantage, and they may affect each other in different ways. Sexual reproduction could be shown through the combination of different genetics to produce offspring, and through probabilities the likelihood of a member reaching sexual maturity could be shown, hence showing inheritance. I could then have certain factors which could change (and maybe be affected by the organisms - e.g. a lot of organisms might lead to an increase in temperature), which would then show selection. As the different genes would interfere with each other, many configurations could be stable (think of it like equation solutions), and this could show speciation as the populations diverge.

In terms of reading, recently I have read "Life's Greatest Secret" by Matthew Cobb, "All the Light We Cannot See" by Anthony Doerr, "Human Universe" by Brian Cox and "The Secret War" by Max Hastings. I enjoyed each of these books, and looking back on it now my reading has apparently been quite varied recently. I am currently reading "All Hell Let Loose" by Max Hastings, and I am going to be reading "The Mysterious World of the Human Genome" by Frank Ryan and "Life Unfolding" by Jamie A. Davies, and possibly having a reread of Metro 2033.

Anyway, as said, I hope you all had a nice Christmas and have a brilliant new year.


Changes to ChemSi

Recently I chose to rewrite ChemSi in its entirety to clean up the code base and add some new features. Instead of only having a text-based front end, which limits the usability (so you can't have cyclic reactions, etc.) and simultaneously makes it harder to maintain (I had wanted to use electron energies instead of electronegativities), I chose to write it in an object-oriented manner. This enables it to be used as a Python module, as opposed to a program.


After running

import chemi

the module will be loaded, and commands can be used within it. For example;

import chemi

# Look up sodium in the periodic table: print its name, its 3s0 shell
# energy, and then the full data structure
q = chemi.periodic_table["Na"].out(1)
print q["name"]
print q['shells']['3s0']
print q

# Set up two Reaction objects
s = chemi.Reaction(300)
q = chemi.Reaction(300)

# React sodium bromide (two units) with chlorine and print the predicted reaction
s.reactants.append(chemi.Compound("NaBr", 0, 0, []))
s.reactants.append(chemi.Compound("NaBr", 0, 0, []))
s.reactants.append(chemi.Compound("Cl2", 0, 0, []))
print(chemi.output(s.return_reactants()) + " -> " + chemi.output(s.return_products()))

# Move the products into the second reaction, swapping the bromine for
# fluorine, and predict again
q.reactants = s.products
q.reactants[2] = chemi.Compound("F2", 0, 0, [])
print(chemi.output(q.return_reactants()) + " -> " + chemi.output(q.return_products()))

will give the following output;

ben@Nitrate:~/Development/ChemSi$ python
Sodium
{'energy': -1.5, 'number': 1}
{'molar': 23.0, 'electronegativity': 0.9, 'name': u'Sodium', 'shells': {'2p1': {'energy': -166.7, 'number': 2}, '2p0': {'energy': -166.7, 'number': 2}, '2p-1': {'energy': -166.7, 'number': 2}, '3s0': {'energy': -1.5, 'number': 1}, '1s0': {'energy': -1646.3, 'number': 2}, '2s0': {'energy': -275.5, 'number': 2}}, 'an': 11, 'small': u'Na'}
Cl2 + 2NaBr -> Br2 + 2NaCl
F2 + 2NaCl -> 2NaF + Cl2

Here I have done something which would've been impossible in a previous version of ChemSi. First of all I used the new periodic table definition to return some information about sodium - the name, the 3s0 shell energy (in kJ mol^-1, approximated using the Rydberg equation) and then a general print-out of it. I then added some sodium bromide and some chlorine to reaction s, and got it to use my algorithm to predict the reaction products before printing out the reaction;

Cl_2 + 2NaBr \rightarrow Br_2 + 2NaCl

Which it predicts correctly. I then move the products into reaction q, but replace the bromine with fluorine. Running the prediction again, I get;

F_2 + 2NaCl \rightarrow Cl_2 + 2NaF

These predictions are typically more accurate than the old model - for example;

FeBr_3 + Al \rightarrow AlBr_3 + Fe

In the previous ChemSi code this would've given AlBr and FeBr2 - which isn't accurate. So, in some respects it is getting better. On the other hand, the algorithm I am using to fill the shell energies has faults with transition metals and elements with a 4s or higher orbital (it fills them going up in n, l, m - i.e. 1s, 2s, 2p, 3s, 3p, 3d, 4s - as opposed to following the actual energies), which means the above demonstration with NaI will not work accurately: the chlorine is calculated to be higher and so is not substituted. Regardless, the new prediction engine seems to work better, and fixes for these issues will be released in the coming weeks.
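To make the flaw concrete, here is a minimal sketch - not ChemSi's code - of the naive filling order described above, compared against the Madelung (n + l) rule that real orbital energies roughly follow:

# Orbitals up to n = 4
orbitals = ["%d%s" % (n, "spdf"[l]) for n in range(1, 5) for l in range(n)]

# Naive order (sorted by n, then l) - the ordering described above:
# 1s, 2s, 2p, 3s, 3p, 3d, 4s, ...
print(orbitals)

# Madelung rule: sort by n + l, ties broken by n - 4s correctly before 3d
madelung = sorted(orbitals, key=lambda o: (int(o[0]) + "spdf".index(o[1]), int(o[0])))
print(madelung)  # 1s, 2s, 2p, 3s, 3p, 4s, 3d, 4p, ...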

The code is already up on GitHub, and alongside it you will find a new class definitions file which explains each of the new classes.

Junk DNA

Price (as of writing): £8.99

Publisher Synopsis: 

For decades, 98 per cent of our DNA was written off as 'junk' on the grounds that it did not code for proteins. From rare genetic diseases to Down's Syndrome, from viral infections to the ageing process, only now are the effects and the vital functions of these junk regions beginning to emerge. Scientists' rapidly growing knowledge of this often controversial field has already provided a successful cure for blindness and saved innocent people from death row via DNA fingerprinting, and looks set to revolutionise treatment for many medical conditions including obesity. From Nessa Carey, author of the acclaimed The Epigenetics Revolution, this is the first book for a general readership on a subject that may underpin the secrets of human complexity - even the very origins of life on earth.



I originally purchased this book as a follow-up to The Epigenetics Revolution, and immediately I noticed the similarities. The two are similar in both content and style (which, in all honesty, is to be expected given they are about very related topics by the same author). As with The Epigenetics Revolution I was impressed - Carey explained the main theories behind epigenetics well, in an easily readable manner, and provided examples of the different effects of the different modifications. The Agouti mice made a reappearance; however, the explanation of them was very similar to that in The Epigenetics Revolution.

I found the information within the book interesting, as it followed on from genetics at school, in which introns are taught as being useless waste - Carey does a good job of dispelling this and providing an explanation of what they do and how they affect life, from telomeres and longevity to the inactivation of the spare X chromosome in females by Xist and Tsix.


I am currently reading 'The Blind Watchmaker' by Richard Dawkins but will probably not write a review on this (it has been out for ages and is pretty well documented).


Apparently next week actually means next month. Sorry for the delay, I have been busy!
Following the last post regarding entropy and Gibbs calculations, I decided to change the aim of this one to entropy. It was originally written as an essay for physics, so please excuse any dodgy wording or citation marks. Furthermore, it is a first draft, so it may have mistakes and other issues. One good side effect of this is that it has a proper list of sources at the end!

Entropy is a measure of the multiplicity of a thermodynamic system, or how disordered a system is - the higher the entropy, the more ways in which the system can be arranged, and generally this means it will be more disordered. Imagine a brick wall - it is ordered, so has a low entropy, but if knocked down it becomes more disordered - the fragments of wall can be arranged in more ways than the original wall, so it has a higher multiplicity, and hence a higher entropy. It is often easier to knock down a wall than to build one, meaning that generally entropy increases. Entropy can also be seen as the amount of energy unavailable to do work. If energy is stored in a low-entropy glucose molecule it can be used to do work; if the energy is spread throughout some paper as heat, it is much harder to use it to do work, and so the paper has a higher entropy.
According to the second law of thermodynamics the entropy of the universe always increases, and over time so does its multiplicity. This means that entropy is one of the only physical quantities which has a direction in time. If time ever ran in reverse, entropy would be decreasing: while a thermometer can go up and down, it would be very unlikely to see a smashed window suddenly reassemble, yet with entropy decreasing it would reassemble easily.
Because the second law of thermodynamics predicts the increase of entropy, entropy has an important role to play in predicting how chemical reactions will occur. The first section of this report concerns itself with how entropy changes work, moving on to Gibbs Free Energy, and the last section is about the heat death of the universe.

Physical Uses of Entropy

The idea that entropy always increases may appear rather odd - if entropy always increases, how come water can be frozen? How come brick walls can be built? The reason these things can occur is that it is the universe's entropy, \Delta S_{univ}, which must increase, not necessarily the entropy of the system, \Delta S_{sys}. As long as the sum of the change in entropy of a system and the change in entropy of its surroundings is positive - i.e. the entropy overall increases - the process can take place. Therefore we can say that

\Delta S_{univ} = \Delta S_{surr} + \Delta S_{sys} > 0

And in order for a reaction or physical process to take place, \Delta S_{univ} must be positive. For example, freezing ammonia is exothermic, so heat is lost to the surroundings. The surroundings gain energy and so become less ordered; therefore, while the entropy of the system (the frozen ammonia) has decreased, the entropy of the surroundings has increased, meaning the overall change in the universe is positive. On the other hand, when ammonia melts, \Delta S_{surr} is negative - melting is endothermic, so it takes in heat. Luckily for the laws of thermodynamics, solid ammonia melting produces liquid ammonia, and the molecules in a liquid can be arranged in more ways than in a solid (they are free to move, so have a higher entropy) - so \Delta S_{univ} is still positive.
This relationship is very useful for working out the temperatures at which processes like this will occur. Another example is the brick wall: in order for a brick wall to be built by hand, carbohydrates are broken down by the body, which increases the multiplicity of molecules in the body - the multiplicity of the surroundings increases, so the total entropy change of the universe is positive.
In relation to solid ammonia and liquid ammonia, as above, entropy allows for the calculation of the freezing point of ammonia. The entropy change of the surroundings of a material can be calculated as follows;

\Delta S_{surr} = \frac{\Delta q_{surr}}{T_{surr}}

Where q_{surr} is the heat absorbed by the surroundings - the enthalpy change - (measured in Joules), and T_{surr} is the absolute temperature of the surroundings (in Kelvin). Typically, q_{surr} is given per mole, which gives the entropy per mole. The reason T_{surr} is the divisor is that the hotter something is, the lower the change in entropy for a given additional amount of energy - imagine a brick wall being pushed over versus a pile of bricks being pushed over: the brick wall has the larger entropy change, as it goes from very orderly to not.
Given that \Delta S_{univ} must be positive, this allows the calculation of the freezing/melting point of ammonia. From the NIST Chemistry WebBook we can find that the entropy change of fusion (the entropy change on melting - fusion is another name for melting) for ammonia is 28.93 J K^{-1} mol^{-1}. As we are looking at freezing we change the sign - the entropy change for melting is the opposite of the entropy change for freezing. We then need the standard enthalpy change of fusion, which the same data source tells us is 5653 J mol^{-1}. Putting this into the above formulas gets us the following;

\Delta S_{univ} = \Delta S_{sys} + \frac{\Delta q_{surr}}{T_{surr}} = -28.93 + \frac{5653}{T_{surr}}

And by plotting this we can see where the entropy change of the universe is positive, and therefore find the fusion point of ammonia.

[Figure 1: \Delta S_{univ} against T_{surr} for freezing ammonia]

As we can see in Figure 1, the entropy change of the universe is positive whenever T_{surr} < 195.4 K. From looking in the aforementioned data book we find that the temperature of fusion, T_{fus}, is 195.4 K - so we have the correct point. This same relationship works for any other thermodynamic system.
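As a quick sanity check, a minimal calculation in Python using the NIST values above:

# dS_univ = -28.93 + 5653/T crosses zero at T = 5653/28.93
dS_fus = 28.93   # entropy of fusion, J K^-1 mol^-1
dH_fus = 5653.0  # enthalpy of fusion, J mol^-1

T_fus = dH_fus / dS_fus
print(T_fus)  # ~195.4 K - freezing is spontaneous (dS_univ > 0) below this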

Gibbs Free Energy

Gibbs Free Energy is a way of doing the same calculation as we just did to work out the fusion point of ammonia, only it considers purely the system and not the surroundings, which makes it easier to deal with. The Gibbs Free Energy is, in essence, the amount of energy needed on top of the energy contained within the system for a reaction to go - so if it is negative then the system has sufficient energy, with some free energy to spare. Note that this only holds at constant pressure. The equation for Gibbs Free Energy is given as follows;

\Delta G = \Delta q_{sys} - T_{sys}\cdot \Delta S_{sys}

This is derived from the same equations we used previously by assuming that T_{surr} = T_{sys} - as both are in the same environment - and the knowledge that q_{surr} = -q_{sys} (because energy leaving the system goes to the surroundings). This allows the following to occur;

\Delta S_{univ} = \Delta S_{sys} + \frac{q_{surr}}{T_{surr}} = \Delta S_{sys} - \frac{q_{sys}}{T_{sys}}

And by multiplying this through by -T_{sys} we gain the following;

-\Delta S_{univ}\cdot T_{sys} = q_{sys} - T_{sys}\cdot\Delta S_{sys} = \Delta H_{sys} - T_{sys}\cdot \Delta S_{sys} = \Delta G_{sys}

Which gives us the Gibbs free energy. If this is lower than 0 then it follows that the reaction is thermodynamically favourable and will occur spontaneously, as there must be a decrease in the overall free energy of the system for the reaction to be spontaneous at a specific temperature. For example, we can show the fusion point of ammonia again;

\Delta G = -5653 - (195.4\cdot -28.93) = 0 J mol^{-1}

As \Delta G is 0 J mol^{-1} we know that this is the fusion point - if T_{sys} were any lower then \Delta G would be negative and the ammonia would freeze (given that these are the enthalpy and entropy changes of freezing).

Heat Death of the Universe

Given that \Delta S_{univ} is always positive, it follows that the amount of entropy, or disorder, in the universe is always increasing. This leads to a theory regarding how the universe will end: increasing entropy implies that, as time goes by, the universe will approach thermodynamic equilibrium and reach its maximum entropy (at the moment it is not in equilibrium, as its entropy can still increase). At that point there will be no more free energy, as needed for spontaneous reactions, so no more reactions will be able to progress - and as such the universe will die.

For an example of this, imagine you spray an aerosol into a room. The fumes will diffuse until the aerosol is fully dispersed, at which point there is no longer one area with a large quantity of aerosol and another without. In the same way, it is theorised that the universe will eventually balance out and reach an equilibrium point, after which it cannot do anything more.


Entropy is one of the major ideas of thermodynamics - it is, in essence, the 2nd Law of Thermodynamics, and therefore is very important for both physics and chemistry. Entropy plays a part in cosmology, as shown in the section on the heat death of the universe, and it could be a cause for the end of the universe - as the universe reaches thermodynamic equilibrium, no more free energy remains. In terms of chemistry, entropy is very useful for calculating whether or not a reaction will occur.


Sources

NIST, National Institute of Standards and Technology

ChemWiki, Standard Enthalpy Changes of Formation: Standard Thermodynamic Quantities for Chemical Substances at 25°C

ChemWiki, Calculating Entropy Changes

4College, Entropy and the Second Law of Thermodynamics

EntropyLaw, Entropy and the Second Law of Thermodynamics

LiveScience, What is the Second Law of Thermodynamics?, Jim Lucas, May 22

HyperPhysics, Second Law: Entropy

HyperPhysics, Entropy as Time’s Arrow

Texas A&M University, Thermodynamics: Gibbs Free Energy

Bodner Research Web, Gibbs Free Energy

UCDavis ChemWiki, Gibbs Free Energy

HyperPhysics, Gibbs Free Energy

How Chemical Reactions Happen, James Keeler and Peter Wothers, March 27, 2003

HyperPhysics, Gibbs Free Energy and the Spontaneity of Chemical Reactions

UCDavis ChemWiki, Second Law of Thermodynamics

A Larger Estimate of the Entropy of the Universe, Chas A. Egan and Charles H. Lineweaver

PhysLink, What exactly is the Heat Death of the Universe?

AskAMathematician, After the Heat Death of the Universe will anything ever happen again?

On the Thermodynamic Equilibrium in the Universe, F. Zwicky