Entropy

I am fascinated by the concept of entropy. Somewhere else you might read something serious explaining entropy in physics; I don’t know if I fully understand such explanations, but there they are for you to munch on. Here, I start with entropy in the physical sense, then mix it with other concepts and explain my fascination. Meaning I will be jumping all over the place, from physics to wherever.

I was one of those who was taught that entropy is a tendency to disorder. Later on I learned that entropy is one of those things you have to put into a chemical equation to calculate the Gibbs free energy, and then know whether a reaction will proceed more to one side than to the other. Here I was left with the feeling of just another unexplainable thing. Entropy was like a magical, unsatisfactory term: something you have to know, with little hope of understanding. Then came protein structure, and I got the first physical thing I could match to the concept to give it a less magical flavour. I was reading how the amino acid Alanine stabilizes an alpha-helix because its side chain, being small, loses less entropy in the protein’s folded state. Bingo! A big side chain, say that of Tryptophan, can move freely to many positions in an unfolded protein. Thus, if it had to be fixed into a limited range of motion within a structure, it would lose a lot of entropy. There you have it: a physical basis for understanding entropy in this realm of science.
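The trade-off above can be sketched with the Gibbs relation ΔG = ΔH − TΔS: a reaction (or a folding step) is favourable when ΔG is negative, and a big entropy loss (negative ΔS) pushes ΔG up. Here is a toy Python sketch; the enthalpy and entropy numbers are made up purely for illustration, not measured values for any real protein:

```python
# Toy illustration of Delta G = Delta H - T * Delta S.
# Negative Delta G means the forward direction (e.g. folding) is favoured.
# All numbers below are invented for illustration only.

def gibbs_free_energy(delta_h, delta_s, temperature):
    """Return Delta G (J/mol) given Delta H (J/mol), Delta S (J/(mol*K)), T (K)."""
    return delta_h - temperature * delta_s

# A folding step that releases heat (delta_h < 0) but also fixes a side
# chain in place, losing entropy (delta_s < 0), near body temperature:
dg = gibbs_free_energy(delta_h=-50_000, delta_s=-100, temperature=310)
print(dg)  # -19000: still negative, so folding wins despite the entropy cost
```

The point of the sketch: the −TΔS term is the entropy “bill” the folded state pays, and a bulkier side chain (a more negative ΔS) makes that bill bigger.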

Other encounters with entropy led me to read about entropy and information. With my other experiences under my belt, this was not too hard, and it beautifully allowed me to think of physical positioning as information, with entropy being the freer state. I also read about entropy as a consequence of random movement (a “probability thing,” I would say, for lack of better vocabulary). We see “disorder” increasing because the probability for, say, all the molecules of a gas to come together in one little corner of a glass is so small that we will never see it happen. In other words, it is not about “disorder.” It is about more probable states. This, of course, matches the entrapped side chain of a big amino acid. The misconception of entropy as “disorder” washed out of my brain, but I could understand why the concept would be explained that way: a metaphor that, however well-intended, did not do justice to the concept.
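That “probability thing” is easy to put in numbers. As a simplified sketch, treat each gas molecule as sitting in one chosen half of the glass with probability 1/2, independently of the others; the chance that all N of them do so at once is then (1/2)^N, which collapses toward zero astonishingly fast:

```python
# How unlikely is it that every gas molecule ends up in the same half
# of its container at the same instant? Simplified model: independent
# molecules, each in the chosen half with probability 1/2.

def prob_all_in_half(n_molecules):
    """Probability that all n molecules occupy one chosen half at once."""
    return 0.5 ** n_molecules

print(prob_all_in_half(10))    # about 0.001 -- conceivable for 10 molecules
print(prob_all_in_half(100))   # about 8e-31 -- effectively never
```

A real glass holds on the order of 10^22 molecules, so the spread-out arrangement is not “ordered” or “disordered”; it is simply the overwhelmingly more probable state.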

One good day, too, I was reading a book by none other than Julio Cortázar. Julito was talking about poets, and mentioned a few names, ending the list with “lo demás es entropía” (everything else is entropy). It clicked so beautifully! Pure genius! (Of course pure genius, this was Julio Cortázar!) I just fell in love with the concept of entropy and the possibility of using it outside of the natural sciences. Powerful concept, my friends. For instance, it makes it easier to explain why people buy into pseudoscience and charlatanry. It is much easier to buy into charlatanry than to think for yourself, find the facts, use some logic and dismantle the quackery. It is also much easier to produce charlatanry than to produce educative material. Thus an abundance of misinformation can be taken for granted. Such abundance makes it more probable for people to find the charlatanry than anything else. Very long explanation, right? Well, we can just say that it is entropically favourable to buy into pseudoscience. Thus, it is a fight to be fought forever. Pseudoscience will not disappear. We can inform people, educate people, but once we get there, we have to keep at it or entropy will take its course. Almost anything can be easily explained as entropy. Now, you might see that I have translated entropy into the unsensical (yes, “un-” because I mean without sense, not against sense), into random noise, into whatever is easier (more probable) to happen. Looks a bit like Murphy’s Law, doesn’t it?

Of course, I am fascinated, but not mystified. It is amazing that such a little single concept can be applied so broadly, but this is no magic. Or rather, no magic other than language and science converging into a powerful and beautiful concept, mixing probability with everything else.

Easy does it, baby.


