Does the New Scientific Evidence about the Origin of Life Put an End to Darwinian Evolution?/Program 2
By: Dr. Stephen Meyer; ©2011
As scientific technology has progressed, scientists have realized just how complex the cell is. According to Microsoft’s Bill Gates, the DNA within each cell is far more complex than any computer software ever created.
Today we take up the most important questions of life: Where did we come from? How did we get here? What brought us into existence? Charles Darwin, in his Origin of Species, admitted that he did not know how the first cell came into existence, but speculated that somehow a few simple chemicals combined and the first primitive cell emerged from the primordial waters of the early earth. But today Darwin’s evolutionary assumptions are being challenged by molecular biologists, as scientists have discovered that the human cell is not simple, but complex beyond belief. One tiny cell is a microminiaturized factory containing thousands of exquisitely designed pieces of intricate molecular machinery, made up of more than one hundred thousand million atoms. In the nucleus of each cell is the DNA molecule, which contains a storehouse of 3 billion characters of precise information in digital code. This code instructs the cell how to build the complex-shaped molecules called proteins that do all the work so that the cell can stay alive. Where did this precise information in DNA come from? Is it the product of purely undirected natural forces? Or is it the product of an Intelligent Designer? Bill Gates of Microsoft has said, “Human DNA is like a computer program, but far, far more advanced than any we’ve ever created.” Today, you’ll learn why the digital code embedded in the DNA of the human cell is compelling evidence of an Intelligent Designer. My guest is Dr. Stephen Meyer, a co-founder of the Intelligent Design movement, who earned his PhD in the Philosophy of Science from Cambridge University. We invite you to join us.
- Ankerberg: When did life originate? Where did life originate? How did life originate? And in answering those questions you’ve got to say where the first cell came from. My guest is philosopher of science Dr. Stephen Meyer, who’s written the best-selling book Signature in the Cell: DNA and the Evidence for Intelligent Design. In recent days, the discoveries about DNA in the cell have formed a very complex picture, so complex that it has led you to posit that the only place it could come from is an Intelligent Designer. Talk a little bit about that and what you’re finding in DNA.
- Meyer: Well, as we talked about in our last program, DNA encodes digital information. It’s a form of digital nanotechnology. And the information in the DNA directs the construction of the proteins and protein machines that are necessary for the cells to stay alive. And this, I think, has reframed the ancient question about design, because in the 19th century with Darwin’s theory, it was thought that he had explained away all evidence of design. And now we have a striking appearance of design that has no, at this point, naturalistic explanation.
- And I think there’s also a positive reason to think that it was designed by an intelligence, but it’s interesting, I think, for our purposes at this point just to acknowledge that this is a huge mystery. I call it the DNA enigma. Richard Dawkins, who denies that there’s any evidence of design, nonetheless has to acknowledge that DNA is full of digital code. He says that “the machine code of the genes is uncannily computer-like.” And Bill Gates says that “DNA is like a software program, only much more complex than any we have ever created.” So, what we’re dealing with inside the cell is information. And that raises a really profound mystery, because we know we need information to build all the important components of the cell. So, if you want to explain the origin of the first cell, the origin of life, you have to account for the origin of information.
- And there’s a German scientist named Bernd-Olaf Küppers who acknowledged this connection early on. He said that the question of the origin of life is basically equivalent to the question of the origin of biological information. That makes sense too, because I used to ask my students the question, if you want to give your computer a new function, what do you have to give it? And they would know right away: you’ve got to give it code, information, software. And the same thing is true in life; there is a connection between information and new form and function. And if you want to get a functional living organism off the ground, you’ve got to have that information. So that’s the DNA enigma, that’s the big mystery that has to be solved.
- Ankerberg: And it’s a lot of information, it’s not just a few things, it’s a lot of things in a lot of different parts that are all programmed to work together. It’s very complex.
- Meyer: It’s lines and lines of code. I mean, in the human genome you’ve got three billion characters; in lower animals, somewhat less, but it’s an amazing amount of information in every living organism.
- Ankerberg: Now, folks, we have trillions of cells in our bodies, okay. Each one of them has DNA in it; and DNA is doing a specific function in the cell that’s fantastic. And I want you to describe it for the folks the best that you can.
- Meyer: Well, DNA is special for two reasons. We’re aware of its amazingly beautiful double helix structure. The double helix is almost an icon now; in our culture we see it on news programs about criminal investigations. But the thing that people typically overlook about DNA is that it’s chock full of information; in fact, information that’s stored in a digital form. So it’s very much like a software program, in that you have very specific arrangements of chemicals along the spine of the DNA that function just like alphabetic characters in a written language or digital zeros and ones in a machine code. So it’s the information-carrying capacity, or information-storage capacity, that’s really important.
- Now the information in DNA ends up directing the construction of proteins and protein machines. When we were together in the last program, I used the analogy of what goes on at the Boeing plant up in Seattle where I live, where engineers will use digital information to direct the construction of mechanical parts. For example, an airplane wing: where you place the rivets will all be controlled by machinery that’s being directed by digital information that’s flowing down the line. That’s exactly what’s going on inside the cell; you’ve got digital code directing the construction of mechanical parts. And that raises a huge question, which is, where does all that information in the cell come from? We know where it comes from at Boeing, it’s designed by engineers, but where does it come from inside the cell?
- Ankerberg: Talk a little bit more about what that information is like. It’s not just complex information, it’s specific information.
- Meyer: Yeah, very good. Let me give just a little bit of a science lesson here. There are two definitions of information that engineers, scientists and ordinary people are familiar with. In the engineering world, there is something called “Shannon information”; it’s a mathematical formula that describes how much information can be carried along an information channel, so it’s sometimes called information-carrying capacity. But the mathematical definition of information only captures a part of what we typically mean by information. It just refers to the improbability of a sequence of characters: the more improbable an arrangement of characters, the more information is being conveyed.
- Ankerberg: Yeah, it’s like an alphabet where you just have gibberish; you got a lot of letters, but it doesn’t do anything.
- Meyer: So, look for example at this slide. At the top you have the gibberish string I, U, I, N, S, K. That string has information-carrying capacity: it could be arranged to convey information, but we don’t know whether it’s functional or non-functional. We don’t know whether it’s conveying any meaning. But we can still calculate how improbable that exact string is. And that’s all the mathematical theory of information does: it calculates the information-carrying capacity, but it doesn’t tell us whether the information is functional or specifically arranged to convey a function.
- Ankerberg: Yeah, DNA has the second one.
- Meyer: It has the second type of information. And here I use an English analogy: “Time and tide wait for no man.” That’s a clearly functional string of characters, conveying information that is meaningful, okay?
- Now, there’s some terminology that goes with this. The top string is said to be complex, but it’s not said to be specified and complex. In mathematical parlance, complexity and improbability are the same idea. So the more improbable, the more complex something is. But DNA isn’t just complex; it’s not just an improbable arrangement of characters, it’s specified and complex, just like English language or computer code is specified and complex.
- So when we talk about the DNA enigma, the mystery surrounding the origin of the information you need to build the first cell, we’re not just talking about information in the mathematical sense of improbability, like the first line in the slide to my right. Instead, we’re talking about information that is specified: the arrangement of characters matters to the function the string of characters performs. And that’s what we have in DNA. Francis Crick made that point very early on. He was aware of the mathematical theory of information, and he said that when we’re talking about information in DNA, we’re not talking about merely mathematical information, merely Shannon information, as engineers call it; we’re talking about specified information, or functional information. As he put it, “Information means here the precise determination of the sequence, either of the bases in the DNA or the amino acids in the proteins.” Those subunits of those important molecules have to be specifically arranged to perform the jobs that they do inside cells.
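The distinction Meyer draws can be illustrated with a short Python sketch (not part of the program; the 26-letter alphabet and equal-probability assumption are illustrative simplifications). Shannon’s measure assigns the same carrying capacity to any two strings of the same length, whether or not they mean anything:

```python
import math

def shannon_info_bits(sequence, alphabet_size=26):
    """Information-carrying capacity of `sequence`, in bits, assuming
    each character is drawn independently and equiprobably from an
    alphabet of `alphabet_size` symbols. One exact sequence of length
    n has probability (1/alphabet_size)**n, so its improbability,
    measured in bits, is n * log2(alphabet_size)."""
    return len(sequence) * math.log2(alphabet_size)

# Same length, so Shannon's measure assigns the same capacity --
# it cannot tell gibberish from a meaningful sentence.
gibberish = "NDIEASTEONMITOTIAMRFWNA"
sentence  = "TIMEANDTIDEWAITFORNOMAN"
print(shannon_info_bits(gibberish))   # about 108 bits
print(shannon_info_bits(sentence))    # the same value
```

This is exactly the gap the transcript describes: carrying capacity measures only improbability, not whether a sequence is specified to perform a function.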
- Ankerberg: Alright, Stephen, people in the audience are probably asking the question, so what if it’s arranged precisely? So what if the sequence is there? If it wasn’t there, what would that mean? The cell would be dead, basically, wouldn’t it?
- Meyer: Well, right, because the information in the DNA molecule provides the instructions for building new proteins, and the proteins are the toolbox of the cell. They do all the important jobs to keep the cell alive. So if you have the wrong sequence of bases in DNA, you won’t build a functional protein. When we were together last time I used a visual aid; I’ll bring it up again if you don’t mind, because it gets the idea across. These are snap-lock blocks, but I use them to illustrate how proteins are constructed. Proteins are long, chain-like molecules; they’re made of smaller chemical subunits called amino acids. Each of the beads here represents one of the 20 different types of amino acids. DNA sends a signal, information that tells the cell the sequence in which it is to attach one amino acid after another. If the amino acids are attached properly, then the chain will fold into a three-dimensional structure that will have a hand-in-glove fit with other molecules in the cell and will allow the chain to do the job it needs to do. So, if you get the sequence of the bases on the DNA right, then you’ll get the correct sequence of amino acids, which will cause the chain to fold properly and enable it to perform a function.
- Ankerberg: Alright, you’ve discovered all of that, which brings up the question, where in the world did this information come from? How did it get there in the first place, because the information, as we’re going to see, is absolutely enormous and complex. I mean you can have 150 of those amino acids in just a regular protein.
- Meyer: That’s just a short protein.
- Ankerberg: You could have 1,000 or more in the bigger proteins and they’ve all got to be arranged in the right order. So what are the theories that are being proposed here to answer the question: How did this start off?
- Meyer: This is the question that fascinated me, because I became intrigued with what I call the DNA enigma. The enigma is not what DNA does; we know that. The question is: Where did the information that DNA stores come from in the first place? There have been a number of completely naturalistic or materialistic theories proposed, and they’ve been nicely summarized by a French scientist named Jacques Monod, a colleague of Francis Crick, the co-discoverer of the structure of DNA. Monod wrote a book in 1970 saying that if you’re going to be a scientist and you want to explain something, there are some basic approaches to explanation that you need to stick to. One is to rely on chance, on random variations of some kind. Another is to rely on what he called necessity, which is a kind of scientific code-word for relying on natural laws: if I drop a ball and it falls to the earth, scientists would say it falls by necessity, according to the law of gravity. And Monod said there’s also a third approach, which combines chance variations with law-like processes of necessity. That’s roughly what Darwin did in combining natural selection and random variations, and Monod said that’s an acceptable scientific approach too. So, if you’re addressing the question of the origin of life from a naturalistic or materialistic point of view, you want to explain the origin of life by chance, by necessity, or by a combination of the two.
- And as I investigated the DNA enigma, I kind of followed that logic and said, okay, how have those different approaches to explaining the origin of information fared? Have they been successful in accounting for this great mystery; or is the mystery even deeper than we maybe realize? And I came to the latter conclusion.
- Ankerberg: Yeah. How many of you listening heard that life arose by chance, or by chance and natural selection, alright? If you heard that and you’re still persuaded of that, we’re going to take a break, when we come back, we’re going to unscramble that, and we’re going to show you, what are the chances that it could happen by chance? Alright? Stick with us and we’ll be right back.
- Ankerberg: Alright, we’re back. We’re talking with Dr. Stephen Meyer, a philosopher of science who’s written a best-selling book. Here it is: Signature in the Cell: DNA and the Evidence for Intelligent Design. And we’re answering the question, how did life originate? Where did the first cell come from? Now we’re talking about the naturalistic theories that have been proposed to answer this question: Where did the specified information in the DNA molecule come from? How did it get there? And one of the proposed answers is “by chance,” and you’ve got a quote there that kind of tells where the folks are at on this.
- Meyer: This was from a famous biochemistry textbook. It was written in the ’70s and it kind of summarized what a lot of people had in mind. It’s from Albert Lehninger, the author. He says, “We now come to the critical moment in evolution in which the first semblance of life appeared through the chance association of a number of abiotically formed macromolecular components.”
- Ankerberg: What does that mean?
- Meyer: Well, in English, it just means that all those parts of the DNA molecule, that we were seeing in the previous segment, and the parts of the protein arrange themselves by chance into the informational arrangements, the informational sequences, that are necessary for those molecules to perform the functions they do in cells. I wouldn’t say it was a full-blown approach, it was kind of assumed by a lot of scientists early on that chance would solve the problem. I’ve actually been criticized in the book for taking that proposal too seriously. Most scientists now believe that chance is completely inadequate for explaining the origin of the information you need to build life.
- Ankerberg: Alright. So, let’s talk about this thing. Why is it that it couldn’t have happened by chance?
- Meyer: Well, you can never say never for sure, okay. You can’t say for sure that something couldn’t have happened; but you can say that it is vastly more improbable than not that life arose by chance. So improbable, in fact, that people would dismiss it as not a credible hypothesis, alright?
- Ankerberg: I think, the folks, when they hear what you’re going to say…
- Meyer: Yeah, let me explain why. Here is an illustration I used to use with my students. I’ve got an old bag here of Scrabble letters. I would ask my students to test the hypothesis that chance is an effective way of building new information, of generating new information. I’d walk out into the aisles, have them pick out letters at random, and then have them write whatever letter they chose on the blackboard, in the sequence that they…
- Ankerberg: …picked out of the bag.
- Meyer: …in which they chose the letters. And invariably what would turn up on the blackboard would be some gibberish like Z S U A E T, you know, whatever, an obviously improbable arrangement of characters, but not specified to perform a function, not meaningful, okay? And that’s the problem with chance: it generates unspecified arrangements, but not specified arrangements.
- Ankerberg: Yeah, you got gibberish vs. time and tide wait for no man.
- Meyer: Exactly, and chance does a great job of giving you the gibberish, but it does not produce information. Now, occasionally we’d have a situation in which a student, maybe one of the first three students, would come up with something like “ben” or “um,” something that was at least word-like. And the students would often start to do cat-calls and say, ah-ha, we got you, we’re going to produce a lot of information by chance. But I would always win the point, win the argument, by allowing the experiment to keep going. And eventually the gibberish would completely swamp any hint of meaning that was in the sequence.
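The classroom experiment can be simulated. The sketch below is only illustrative: the tiny stand-in word list and the uniform letter draw (real Scrabble tile frequencies are not uniform) are assumptions, not part of the program. Even so, it shows how rarely random draws produce anything word-like:

```python
import random

# Tiny stand-in dictionary (an assumption for illustration; a real
# check would use a full English word list).
SHORT_WORDS = {"CAT", "DOG", "THE", "AND", "MAN", "BEN", "TOP"}

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def random_draw(length, rng):
    """Draw `length` letters uniformly at random, like pulling tiles
    from a bag (ignoring real tile frequencies)."""
    return "".join(rng.choice(ALPHABET) for _ in range(length))

rng = random.Random(0)   # fixed seed so the run is repeatable
trials = 20_000
hits = sum(random_draw(3, rng) in SHORT_WORDS for _ in range(trials))
print(f"{hits} of {trials} random 3-letter draws were words")
```

With only 26**3 = 17,576 possible three-letter strings and a handful of them being words, hits are a small fraction of the trials; for longer strings the fraction collapses toward zero, which is the swamping effect Meyer describes.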
- Ankerberg: Yeah, if you had 150 letters going down the line, the fact is, you’d have gibberish.
- Meyer: Yeah, absolutely. And there’s a reason for that, and I’ve got a little demonstration on that point as well. It’s called the problem of combinatorials. And this is what people don’t realize. Before we get to the slide, I have a visual aid, okay. It’s a little dinosaur puzzle and it’s got four dials. On each dial there are six possibilities: one for each of six dinosaurs. And you’ve got a dial for the head, the torso, the tail and the label of each dinosaur. And the idea here in the puzzle is you’re trying to turn it to get the Tyrannosaurus Rex head, body, tail and label, all to line up.
- Now, what are the odds of doing that by just turning the dials at random? That’s the problem of combinatorials: there are lots of different combinations, so the odds are actually very small. You’ve got a one in six chance of getting the correct head, and then a one in six chance of getting the correct body, tail and label. And you might want to say the odds are six plus six plus six plus six. But that’s not how it works, because you have to take into account all the different combinations that can be formed. I’ve got a one in six possibility on the dial for the head, but on the second dial, for the torso, I’ve also got a one in six possibility. So there are actually six TIMES six possibilities. And then when you get to the third dial, you’ve got another times six; and the fourth dial, another times six. It ends up that you actually have 1296 possible combinations, so the odds of getting the correct one are only one in 1296. Now, if I, say, give you 10 seconds and you get to turn the dials three times, is it more likely or less likely than not that you will stumble onto the correct combination? It’s obviously less likely than not. It’s not impossible, but it’s far more likely that you won’t solve the problem by chance than that you will.
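The dial arithmetic can be checked with a short simulation. This is an illustrative Python sketch, not part of the broadcast; the choice of target combination and random seed are arbitrary assumptions:

```python
import random

DIALS, CHOICES = 4, 6    # head, torso, tail, label; six dinosaurs per dial
total = CHOICES ** DIALS
print(total)             # 6 * 6 * 6 * 6 = 1296 possible combinations

# Estimate the chance that three random spins of all four dials
# hit the one correct setting.
rng = random.Random(42)
target = (0, 0, 0, 0)    # arbitrary stand-in for the T. rex setting
trials = 100_000
successes = sum(
    any(tuple(rng.randrange(CHOICES) for _ in range(DIALS)) == target
        for _ in range(3))
    for _ in range(trials)
)
print(successes / trials)   # close to 1 - (1295/1296)**3, about 0.0023
```

The simulated rate sits near the exact value, confirming that possibilities multiply across dials rather than add.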
- Ankerberg: But there, you’re just getting started with the combinations here.
- Meyer: Right. And this is the same problem that applies to the origin of life. If you want, I’ve got one more illustration: a bike lock. Now, instead of six possibilities on each dial you’ve got 10, and you’ve got four dials. Now you’ve got 10,000 possible combinations. What if you had a bike lock with 10 dials? Now you’ve got 10 x 10 x 10 and so on, ten times over: 10^10 possibilities, or 10 billion combinations. If you’re a thief and you’re trying to steal a bike and it’s got that kind of a lock on it, you’re going to be spinning dials until the cows come home and you’re never going to hit the right combination. There are just too many combinations to search. The odds are always going to be against you. It’s always going to be more likely than not that you won’t solve the problem by chance. Vastly more.
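The bike-lock arithmetic is a one-liner to verify: each extra dial multiplies the search space by the number of symbols per dial. A minimal sketch (illustrative, not part of the broadcast):

```python
def combinations(symbols_per_dial, dials):
    """Size of the search space a lock presents: symbols ** dials."""
    return symbols_per_dial ** dials

print(combinations(10, 4))    # 10,000 for a four-dial lock
print(combinations(10, 10))   # 10**10, i.e. 10 billion, for ten dials
print(combinations(6, 4))     # 1296, the dinosaur puzzle for comparison
```

Note that 10^10 is 10 billion, not trillion; the growth is exponential in the number of dials, which is why adding dials defeats random search so quickly.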
- Now, in the case of proteins, it’s much worse, because you have 20 possibilities at each site, and you don’t just have 10 sites. For a very modest, short protein of, say, 150 amino acids, you’ve got an enormous number of combinations: 20 x 20 x 20 and so on, 150 times over, which is 20^150. Converting that into base 10, that’s one chance in 10^195 of getting a specific sequence. Now, in my book, I go into this in more detail. There’s a lot of…
- Ankerberg: What’s that number like?
- Meyer: Well, there are only 10^80 elementary particles in the whole universe. There have only been 10^17 seconds since the origin of the universe, assuming a 13 billion year old universe. This is an unimaginably large space. So the best way to get a handle on it is to imagine we’re looking for a needle in a haystack; only the haystack is the size of the galaxy, or maybe the universe, and the needle is somewhere out there and you’ve got 10 seconds to look. That’s such a small fraction of time in relation to all those possibilities that the odds are still like the thief with the bike lock, or me with the dinosaur puzzle: it’s always vastly more likely than not that you’re not going to solve the problem by chance.
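The base-10 conversion quoted here can be reproduced in a few lines: since log10(20) ≈ 1.3, a chain of 150 sites with 20 alternatives each gives 20^150 ≈ 10^195 sequences. A short Python check (the particle and time figures are the ones quoted in the program):

```python
import math

AMINO_ACIDS = 20   # alternatives at each site
LENGTH = 150       # a short protein

# Express 20**150 as a power of ten: the exponent is LENGTH * log10(20).
exponent = LENGTH * math.log10(AMINO_ACIDS)
print(f"20^{LENGTH} is about 10^{exponent:.0f}")    # about 10^195

# Scale figures quoted in the program:
PARTICLES = 10 ** 80    # elementary particles in the observable universe
SECONDS = 10 ** 17      # seconds since the origin of the universe
print(f"sequence space outnumbers particles by about 10^{exponent - 80:.0f}")
```

This confirms the conversion step in the dialogue: the number of possible 150-residue sequences dwarfs both figures by well over a hundred orders of magnitude.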
- Ankerberg: Alright, Steve, what do you want our audience to come away with from this program? What’s the bottom line?
- Meyer: Well, the bottom line is that chance is not a plausible explanation for the DNA enigma, the origin of the information necessary to build the first life. In the book I make a very precise calculation that takes into account a number of different factors, more than we’ve been able to discuss here. But the number I come up with shows that it’s always more likely that life did not arise by chance than that it did. Which is another way of saying there is always a better explanation than the chance explanation. And so, maybe next time we can look at some of those other proposed explanations. But universally within the field of origin-of-life studies, scientists have rejected the idea that chance alone produced the first life.
- Ankerberg: Alright, folks, we’re just getting started. Next week we’re going to turn to what happened when this information was discovered; and it’s even more complex than what we showed you right here. Those who did not want to take the Intelligent Design side had to come up with a naturalistic theory of how this information got there at the beginning. We’ve already talked about chance, and what you’re saying is that scientists now have rejected chance. What have they turned to? What are the next two theories? We’ll talk about that next week. Folks, you won’t want to miss it.