2016: The Start of a New… Decade? Part 2

In 2005, Science magazine published their 125th anniversary issue and posed 25 of the biggest questions “facing science over the next quarter-century”.1 In my previous post, I talked about how, ten years later, the Center for Sustainable Nanotechnology is tackling one of those questions: “How Far Can We Push Chemical Self-Assembly?”2 In today’s post, I’m going to introduce some of the CSN’s work on another big question from Science 2005: “What are the limits of conventional computing?”3

Science’s 125th anniversary issue in 2005 raised questions “for the next quarter century” that remain relevant in 2016. (nanotube image from Wikimedia)

What Are the Limits of Conventional Computing?

When people think about chemistry experiments, they often think of Bunsen burners and flasks, microscopes and lasers. Sustainable Nano readers probably also think of cells and nanoparticles! But another type of research in the CSN focuses on using computers instead of (or in addition to) all those “traditional” tools of chemistry. One of the ways the CSN will push the limits of computational modeling efforts is in supporting and leading the design of new experiments.

Tools of the chemistry trade… Today may be Bunsen Burner Day, but this post is all about computers! (images L-R from Pixabay, Pixabay, & Wikimedia)

Conventional computing works much like the CSN and other research centers and groups do: it tackles challenging problems by dividing the work and collaborating. Some of the world’s largest supercomputers have as many as several hundred thousand processors (although a single user can typically access only a few thousand of those processors at a time). A problem is divided into sets of calculations or tasks that can be carried out simultaneously across many processors. These calculations are said to be running “in parallel” (hence the term “parallel computing”).

Grad student Ian Gunsolus explained the limitations of current computing capabilities very well in this previous post. It is not computationally feasible (at least not yet) to model the atomic-level interactions between nanoparticles and large biomolecular systems like proteins and membranes. There are just too many atoms moving according to too many different variables (for example, the distances between an atom and its neighbors, whether an atom is bonded to its neighbors or not, the charges of the atoms, etc.). The motions of atoms are calculated femtoseconds at a time (1 second = 10^15 femtoseconds), but biological processes can span nanoseconds or longer (1 nanosecond = 1 million femtoseconds).

Let’s say we want to model a system of charged gold nanoparticles and a protein in a solution of water and ions, all adding up to a million atoms. If we want to simulate this system for one second, the number of calculations the computer has to run will be on the order of a million (atoms) × the number of variables × 10^15 (femtosecond time steps)! Given that most real experiments take a lot longer than one second, saying that the largest supercomputer using all of its processors would have a little trouble with this problem is an understatement.
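A quick back-of-envelope estimate shows why this is intractable. The per-atom operation count below is an assumed, purely illustrative number; only the atom count and the femtosecond-to-second conversion come from the text:

```python
# Rough estimate of the cost of one second of simulated time.
n_atoms = 1_000_000          # atoms in the system (from the example above)
ops_per_atom_per_step = 100  # assumed cost of evaluating the variables per atom
timestep_fs = 1              # a typical molecular dynamics step, ~1 femtosecond
steps = 10**15 // timestep_fs  # 1 second = 10^15 femtoseconds

total_ops = n_atoms * ops_per_atom_per_step * steps
print(f"{total_ops:.1e} operations")  # prints 1.0e+23 operations
```

Even a machine performing 10^16 operations per second would need on the order of ten million seconds, months of nonstop computing, for this single second of simulated time.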

Stampede, a supercomputer at UT Austin that many research groups across the country (including some in the CSN) have access to through NSF’s XSEDE program. (image credit: Texas Advanced Computing Center)

So how can we simplify problems like this down to a manageable size? Within the limits of computing, one strategy for the CSN will be to remove nonessential atoms and variables from our models of biomolecules and nanoparticles. We can do this at each stage in the nanoparticles’ life cycle successively (and successfully!). The aim will be to determine the mechanism of a nanoparticle’s toxicity: from its attachment to the cell membrane to changes in the structure and dynamics of the membrane itself, and to other associated physiological changes that may lead to cell death.

Keep following us to see the types of computational models we develop and our progress in both simulation and experiments as we try to answer Science’s (and science’s) big questions.


REFERENCES

  1. Kennedy, D.; Norman, C. What Don’t We Know? Science 2005, 309 (5731), 75. doi: 10.1126/science.309.5731.75
  2. Service, R. How Far Can We Push Chemical Self-Assembly? Science 2005, 309 (5731), 95. doi: 10.1126/science.309.5731.95
  3. Seife, C. What Are the Limits of Conventional Computing? Science 2005, 309 (5731), 96. doi: 10.1126/science.309.5731.96