Christine Peterson of the Foresight Institute on Nanotechnology
Christine Peterson writes, lectures, and briefs the media on coming powerful technologies, especially nanotechnology. She is Founder and Vice President, Public Policy, of Foresight Nanotech Institute, the leading nanotech public interest group. Foresight educates the public, technical community, and policymakers on nanotechnology and its long-term effects.
She serves on the Advisory Board of the International Council on Nanotechnology, the Editorial Advisory Board of NASA’s Nanotech Briefs, and on California’s Blue Ribbon Task Force on Nanotechnology.
In this interview, Peterson describes a variety of approaches to enhancing responsibility in innovation, including participation in professional societies, understanding the special distinctions between disciplines, and analyzing the ethos of software developers.
Jeff Ubois: What strikes me about nanotech is how poorly it’s understood, or how many different ways it is described. Could we start by describing some of the misconceptions about nanotechnology?
Chris Peterson: Misconceptions about nano. Well, there are dumb simple misconceptions, and I can go through those if you want, just so that you know what they are. But they’re really boring to me, so I tend to skip them. So, if you want the dumb simple ones, you have to say, “I want the dumb simple ones.”
Jeff Ubois: No, let’s go for the ones that are more dangerous because they’re more subtle.
Chris Peterson: Well, because they’re more true, more interesting. First of all, I regard myself as an environmentalist, that’s why I’m in nanotechnology. Since then, of course, I’ve seen many other applications, medicine especially, but that’s why I got into this.
Now some environmental groups are taking a very balanced approach to nano, where they are saying, “Gosh, this looks like it could be really, really exciting technology for clean manufacturing, cleaning things up.” Others are taking kind of a knee-jerk approach.
One thing that’s kind of a subtle point–and this is something that Europe, in particular, has to watch out for–is the idea that this time, we’re going to get it right with nano, like we have not gotten it right before, in terms of regulation of new technology and safety.
Now, what’s wrong with that? Of course, we all want to get it right. However, if what you do is say, “We’re going to give all the old technologies a pass, and we’re going to focus only on the new ones. We’re going to do it right on this one.” Well, okay, so the new ones are probably cleaner, but what are you doing? You’re slowing them down and maybe blocking them because they’re not 100 percent clean, and letting the old stuff (which is dirty) continue. That’s what I see happening in the US. Now, Europe is a little different. They have something called REACH, have you heard about this?
Jeff Ubois: I don’t know it.
Chris Peterson: This is an EU-wide thing. It’s a massive effort to try to address safety, environmental, and health issues in the chemical industry. In that sense, what Europe is doing is saying, “Let’s not give the old stuff a pass. We’re not giving anything a pass; we’re going to be clean with chemicals.” And presumably, REACH will apply to nanotech as well. In that sense, they are trying to “go back and get it right” on the old stuff as well.
On the other hand, maybe I’m naïve, but they’re going to spend an immense amount of money on this, because what they’re going to do is require testing of all the old chemicals–I mean everything. Now, as somebody who believes technology can be a good thing and can improve, maybe we should at least consider the possibility that a smarter use of that money for safety and health and environment would be to say, “Why don’t we take that money and put it in to R&D for clean technologies?” Wouldn’t that make more sense, rather than saying, “No, let’s assume that the old technologies are going to last for a long time.”
I don’t have the answers. But I’m not sure they asked the question over there. And over here in the US we are not taking that approach yet. I think our chemical regulations need improvement. And even though I’m a big technology fan, I can look at it and say, “Wow, we got some issues here. It is not clean enough.”
Jeff Ubois: It has been reported that there are on the order of 500 chemicals in our bodies that were not in the bodies of our ancestors.
Chris Peterson: Yes. I think Europe tends to go too far in one direction, in terms of saying, “No, no, let’s regulate the heck out of everything,” rather than moving forward. Asia goes the other way: “Let’s move forward at any environmental or health cost.” China is, I think, the best example of that. And the US is kind of in the middle. I’d like to see chemicals and nano treated in a uniform fashion.
Jeff Ubois: The perception is that the risks of chemicals in use are, in some ways, more known or more possible to assess than technologies that are still being developed or that haven’t been deployed. And therefore, the risk profile for nano is very different.
Chris Peterson: It is absolutely true that we tend to know more about old stuff, and less about new stuff. And we should work very hard on that–this is actually something for Europe and Italy to pay attention to.
The US made a big mistake in its National Nanotechnology Initiative by not setting aside anything for environmental health testing. Social and ethical discussions about [nanotechnology] were funded, though.
In the US, we tend to regulate products. Our regulation of processes is weaker. Our product regulation is stronger. So, I can see how the people who pushed the bill through thought, “It is not the job of the federal government to do this.”
However, because our regulatory system on processes is so weak–not non-existent, just not good–you can see how the industry’s saying, “Cool, let’s go, go, go.” And then they rush ahead, and then they find out, “Oh wait a minute. Now you’re going to institute new kinds of environmental testing? You’re going to come up with a whole regulatory apparatus for nano and we don’t know yet what it’s going to be?”
So, as a business, how would you like to run a business in that situation? When you don’t know what the rules are going to be. You can’t predict, you can’t know what the testing’s going to be. You don’t know what your liability will be. Your insurance company doesn’t know what your liability will be. Basically, everybody’s just going, “Oh my God, what is this? They’re going to hold us to higher standards? They’re clearly going to hold us to different standards and we know that. And we’re supposed to predict it?”
Jeff Ubois: Well, it would seem to depend on how you define the technology. Why can’t someone say, “Well, we’re just part of the chemical industry”?
Chris Peterson: That is what they’re doing so far. However, as you are probably aware, they’re finding that some of these materials do operate differently in the body, in the environment. They act differently. For example, nanoparticles, as compared to single molecules, appear to have a route–an unusual new route, possibly–into the brain, passing the blood-brain barrier. What that means, though, is that many people in the US and Europe and Asia are all noticing this and saying, “Hmmm, we’re going to have to require some new testing.” And we don’t have a clue what those tests are going to look like yet.
Now, some people are starting to come up with proposals. The Foresight Institute is a member of ICON, the International Council on Nanotechnology. This issue is also being addressed by the insurance companies. Swiss Reinsurance has provided funding, and so has the U.S. State Department. And the U.S. Environmental Protection Agency has also put money into the IRGC, the International Risk Governance Council, based in Zurich.
Jeff Ubois: I’d love to hear more about that.
Chris Peterson: They did a great job. The International Risk Governance Council devoted half its report and half the time of this conference I went to, to the long-term stuff–the nano systems and nano devices. That’s when you get the real issues — the new weapons, the nano devices that repair your cells, nano surveillance, nano toxicity.
Jeff Ubois: So that all means thinking about the process of innovation, and how it progresses from discovery to application, or looking at the way nanotechnology innovations have diffused so far. When you look at something that’s really pure theory, or in a contained lab of some kind, and then you look at the other end, like the stain-proof pants, there’s quite a range. How do you at the Foresight Institute think about watching those different stages of innovation or watching the diffusion of nanotech innovation? And if you were going to think about useful policy interventions at the different stages–
Chris Peterson: Well, many, many stages. The first one is the funding of the R&D. There are basic, very deep things like your R&D infrastructure, your scientific infrastructure, your education system, whether you let immigrants in on H-1B visas. Like, if they get a Ph.D., do they get to stay? All those kinds of issues. And then there are even broader issues, like how does your society look at immigrants? Are you welcoming, do you integrate them, whatever? Those are very basic, fundamental societal issues, and very challenging issues.
And from there, you come into the extremely important issue of intellectual property–what happens to the rights on that research? A huge controversial issue, as far as I’m concerned.
Jeff Ubois: The guy who invented software patents is now at Berkeley and he says he regrets that. All the bio people are coming to him and saying, “How could we do this?” And what he says is, “Think twice.”
Chris Peterson: Exactly. Don’t do it this way. Intellectual property is a critical, critical issue. And if the one thing you come back to them with is, “Whatever you do, look at this, fix this, try to do well with this,” that’s a big win.
Jeff Ubois: That really speaks to the concept of sustainability in innovation, which is another thing we’re trying to understand in a deeper way–making innovation sustainable in multiple ways – economically, environmentally, socially…
Chris Peterson: You know, I’ll throw out something somebody needs to consider someday–and this is a very radical thing, I don’t know how I feel about it, I mean, I’m very nervous about it. But here’s the problem: how do people make a living in the long term, when there’s this encroachment of automation? And here’s the thing: You’re asking a populace, let’s say the Italian populace, to pay taxes now to invent technologies that are going to put them out of work.
Now, what about that? I am not a Bolshevik or anything like that, but I think there’s an issue here. Citizens are making an investment. How is that investment paying off for them personally? You could say, “Here’s how we do it in the US,”–and it’s a positive answer–“the way it works for [the populace] is that the intellectual property rights go to the universities, and these people are going to send their kids to the universities and the kids will benefit and blah, blah, blah.” That works in the near term, and it also works for the folks who send their kids to the universities, which not everybody does. So what about the other folks? How do we give them a cut of the pie?
Jeff Ubois: They’re the first ones to get automated out of a job, anyway.
Chris Peterson: If we’re asking these folks to put up the cash to develop new technologies, we have to find a way to give them a cut of the action. I think it’d be fun to have a conference or workshop asking, “How would you do that? How do we turn these people into owners?” Because otherwise, you’re going to end up with dispossessed people, unhappy people, an underclass, and that’s not right. They made an investment. It wasn’t a voluntary investment, but they made an investment, it’s legitimate, it’s paying off. And somehow, we screwed them over. They’re not getting a cut of that action. I’m exaggerating the case to make a point here ….
Jeff Ubois: This also speaks to a deeper issue of public/private partnerships that runs through all of academia now, from the pharma deals where some bit of intellectual property is being developed or being given from the public to the private sector to open access publishing….
Jeff Ubois: I really like the way you started with intellectual property and then went up to the next level, to the investment around that. But you were talking about innovation from the chalkboard to products in daily use.
Chris Peterson: Okay, so we did intellectual property, and then there’s the general business climate. There’s a really interesting policy question, which is how you stimulate the use of nanotechnologies for fundamental human needs that may not have big markets, where people may be poor, or need water, or these things that you and I take for granted.
Governments don’t seem to be doing this very well, but sometimes, people say, “Well, why don’t they post an order, you know, say that if you create this, we guarantee that we will buy a million pieces of it at this price. If you can do it at this price to meet these specs, we’ll buy a million a year.” But [governments] don’t do that very much. They also don’t do prizes very much. And there’s a reason for that, which has to do with budgeting issues; it’s hard.
So yes, there’s this public policy question: we want to help deal with poverty, we want to deal with orphan diseases and these kinds of things. So how do we direct technology innovation toward things like that, instead of new golf balls? Actually, the stain-resistant pants, people make fun of them, but they probably reduce laundering, which is probably good for the environment. But golf balls are sort of my favorite example. Is that really what we want to do? Not that there’s anything wrong with it, it’s just not exciting; new golf balls are not a public policy.
Jeff Ubois: So, where the research dollars get directed and how: are those problems you’d highlight as things that people should focus on?
Chris Peterson: I think what governments try to do, and one thing that infuriates the activists who don’t like technology, is that when government R&D spending on technology is being introduced, one of the things on the list for “why we should do this” is that it will help the poor. And these activists just say, “That is just bullshit. That just doesn’t work, that is not what’s going on here.”
And in a way they’re right. We shouldn’t use that as an excuse. I think some day, yes, some day, many of these technologies will help the poor. But if you want to help the poor now, which is what these activists want to do–they say, “Not 20 years from now, not 30 years from now, we have people who need clean water right now. There’s all this stuff they need. We can do it. It’s not happening.” They’re right, you know, they’re right.
So, if you want to help the poor, probably we shouldn’t use it as an excuse to fund R&D.
Jeff Ubois: How do you think individuals involved in the R&D process can have a personal effect in the world? Is there some set of responsibilities that they can take on at an individual level? Imagine that you’ve been successful in a research grant application, now, what are the ethical obligations that you might consider yourself under? What are the ways in which you might amplify whatever ethical values you think need to be considered?
Chris Peterson: That’s a tricky question, because whistleblowing doesn’t work.
Let’s say you’re in a lab, and you feel that carbon nanotubes are not being handled in a safe fashion. You can either go to the media with that, which is probably going to cost you your career. Or you can participate in the International Council on Nanotechnology (ICON), which just produced the first study of industry practices.
My guess is that the professional societies in almost any technical field have probably played a role in enabling individuals to find allies in a kind of safe, protected way, exerting pressure on academia and on industry in positive ways, at least to balance the huge pressure of profits. And most of these professional societies have ethical codes.
But I think if you really want to know how to create a sense of responsibility, look at the software development community. Talk about political activism…
Jeff Ubois: Yes.
Chris Peterson: They see their work as political. They see it as ethics-based. They think of the ethical consequences of their decisions. They’re very politicized and very aware. So, why is that? Why is that true in software and not so much true in other areas?
Jeff Ubois: Well, I think many consider programming as akin to speech.
Chris Peterson: Someone should do that study to see whether some of these practices in software may be transferable. Many of them inherently are not transferable to other fields. Some of them may be, and maybe we should think about how to do that.
I think there are multiple reasons. You named one reason, but I think there are others. And then how do you spread that culture? It’s possible that culture will spread inherently, because software is in more and more of everything, right? The day will come when you can’t do anything without software involvement. You won’t be able to do a damn thing.
Jeff Ubois: I was going to say, you could imagine it going both ways. You may not be able to read a book without being subject to some Digital Rights Management scheme.
Chris Peterson: Maybe. But let me give you a scenario to help you understand how strong the software culture is:
Let’s say the government of the United States decides that the Internet is going to work differently now in the US. We’re going to make it so the President can delete websites. Let’s say he has the legal power to do it, that Congress passed the law. You know what? It still wouldn’t happen. It literally wouldn’t happen. Even if some programmers went along with it, they would be blocked by others. And you know what? Either they would be clearly blocked or just told “It couldn’t work, it just takes so long…”
There’s a powerful culture there. You cannot make them do things they don’t want to do. And there’s not a lot of fields where you could say that. So something important is going on there. We should figure out what it is.
Jeff Ubois: Are there elements in the public dialogue about nanotechnology that could be improved? Are there mechanisms in interacting with the media or interacting in public that would improve the public dialogue around nano?
Chris Peterson: There’s one thing coming that’s scaring me in the US, and I don’t know so much about Europe. But here there’s this pressure for what they’re calling public participation, which sounds good, right? Who could object to public participation?
Jeff Ubois: Well, it depends on the kind of participation…there has been a lot of press about the Dinosaur Park in Ohio, where even if you have a Ph.D. in astrophysics, you have to sign a pledge that you believe in the seven-day creation as described in the Bible if you want to work there. This place is within a six-hour drive of two-thirds of the population of America, and so the public comes and sees Adam and Eve wandering around with dinosaurs, because evolution is just a controversial theory….
Chris Peterson: How embarrassing. But there is increasing pressure to have public participation. The people who organize these events, though, most of them have an agenda. What they have is an anti-technology agenda, and they may not even know it in a way, do you know what I mean?
That’s why they want to do “public participation.” They don’t like what’s going on and they want to stop it, or they want to change it. So, one thing that Italy and Europe should watch out for is, if you’re going to do public participation, if you think you really want to do that, somehow you have to get the bias out of the process.
Jeff Ubois: How do you get beyond something like “I know how I feel, I’m an expert in how I feel. You can’t argue with how I feel, and I feel this is bad, so there!”
Chris Peterson: It’s very, very, very difficult to do. The NNI paid for a conference on public participation and I went to it. And most of the people who got up and spoke had these agendas. I would say there were a few who got up and seemed to be more even-handed and didn’t seem to have an agenda. But, they were in the minority.
So something to be concerned about is this whole idea of government-paid-for, government-sponsored public participation. What are you going to get? I think you’re going to get whatever you pay for. You can get the public to say anything, depending on how you pick them and what you tell them up front and what you ask them.
Jeff Ubois: You can imagine a million industry front groups entering into that process, too.
Chris Peterson: Everybody, everybody would get in and be totally manipulated. So, good luck. I just think it’s a big can of worms. I’m not sure it makes any sense to try to do that.
Jeff Ubois: Yet there are issues that are worth debating. Are there ways of bringing the debate up front that would produce a more useful outcome?
Chris Peterson: We have a representative democracy, and we have public participation through our representatives. And you’re right to grimace, but we haven’t found anything better. It only makes sense to change that if you do something better than that. As bad as the system is–and it is horrible and it’s got all kinds of problems–that’s public participation. Another problem is that in the US, I don’t know about Europe, there’s complete scientific illiteracy. So what that means is you can tell people anything, and they’ll sort of believe it.
Jeff Ubois: Wow. So your take on the precautionary principle is?
Chris Peterson: Well, there are many versions. The weak version says, “Be careful, use your common sense. Don’t do anything stupid.” Okay, fine. The strong version says, “You can’t do anything unless you prove it’s safe up front,” and you can’t test it, so you’re paralyzed. So somewhere in between is something reasonable. And when people use the term, you never know which version they’re talking about.
Jeff Ubois: A number of people are pushing for more cross-disciplinary approaches to innovation.
Chris Peterson: These social scientists at ASU are pushing what they call real-time technology assessment. They want social scientists involved during the research in all phases. They want them involved all the time, thinking about everything. And they’re going to try it. They’re working with the Biodesign Institute at ASU. But I think the thing to do is to get the researchers thinking. I don’t think you can insert social scientists and expect them to solve everything….
Jeff Ubois: Well, that’s what we’re trying to do. That’s a lot of what the Bassetti Foundation is about, is how do you reach into diverse scientific communities and flag the issue of responsibility or social consequences or…
Chris Peterson: Well, study the software community, figure out why the hell it is that they are so aware of these issues. So aware. And see how to move that over, or whether it can be moved over.
Jeff Ubois: That’s actually a good segue into the surveillance issue, which is one where you can clearly see major social impact by research in a pretty short period of time. And it’s a place where science and policy really interact.
Chris Peterson: Nano surveillance is coming really fast now. I’ll give an example. I talked to a Berkeley professor, a wonderful guy, developing a hand-held wireless device that people would carry around. It would detect chemicals and send the data to a central location, where it would all be put together. His goal is to use it for pollutants and allergens, and to have the information be community-based.
Jeff Ubois: There’s a cancer cluster in some neighborhood, and now we know why…
Chris Peterson: Exactly. So it’s a cool idea, very interesting. But obviously, this exact same technology could be used for surveillance, could be not community-based, but centralized. He’s a very smart guy, I’m sure he’s very aware of that. And it’s being worked on right now. And you can imagine it’s not clear that the people carrying these things would be able to tell what’s being detected. How could they know?
It’s the David Brin thing, which is if we’re going to have all this information, who’s going to have it? Is it going to be that everyone has it? Is it going to be that only the police and government have it? Only companies? Who’s going to have it? And David Brin’s answer is, “It’s going to exist. You might as well have everybody have it because the other answers are worse.” People have a terrible time thinking about this. They don’t want to think about this whole issue.
Jeff Ubois: It’s certainly terrifying to think that power’s in the hands of people who are incompetent or corrupt or dangerous. So, I can understand the emotional resistance. What is the intellectual block that is stopping people from contemplating this particular issue?
Chris Peterson: Well, the one thing I find privacy advocates having trouble with is that they don’t want to believe the future’s coming. These technologies are not only going to be in the hands of government, not only in the hands of corporations; these will be in the hands of individuals. The individuals are going to be collecting data. Now, how are you going to propose that they not show this data? That’s a free speech issue. How are you going to keep them from publishing this data? I don’t think privacy advocates have deeply internalized that fact.
If I were in their shoes, I would be saying, “Okay, this is coming. Let’s think now about who–let’s try to take a principled stance that’s going to last us through this transition.” But their attitude is, let’s just fight every battle one by one, and back up step by step.
Jeff Ubois: Are there any final points that you wanted to make or you didn’t get a chance to make?
Chris Peterson: One thing that comes up is human enhancement. We should be aware of that. In terms of the public, it’s a fun topic, it’s a sexy topic. People love to debate it. I think people who are trying to stop it are wasting their time. If people want to be enhanced, they’ll go to another country and get enhanced. What are they going to do, say they can’t come back?
I think there is one issue that does come up with regard to human enhancement. Again, there’s an activist argument saying, “Should government funds be going toward technologies that are heading in that direction, or should we, instead, invest our government funds somewhere else?” And that’s a legitimate question.
What’s not going to work anywhere is to try to tell rich people they don’t get what they want with their money. If people privately want to have plastic surgery, we permit that. We don’t say, “No, you can’t, you can’t have that. You don’t get a prettier nose.” It’s their money and if they don’t do that here, they’ll go somewhere else and do it. So, trying to stop rich people from doing what they want is not going to get anybody anywhere.
Jeff Ubois: There is a question about the first generation of kids to be enhanced — if you’ve got the 1.0 genetic enhancement, you know that by the time you’re 25 you’re going to get passed by the kids with the 2.0 version.
Chris Peterson: One of the compromises I’m thinking of brokering between the transhumanists and the Christians, and I’ve already been working on this, trying to get a little summit going, is that the transhumanists agree to help the Christians stay the way they want to stay in the world of the future. They have the right. Nobody should force them. Transhumanists will defend their right not to be enhanced. And the deal is, the Christians will stop interfering with the transhumanists, on the condition that the transhumanists don’t enhance children. In other words, it’s something adults can do for themselves, but not for children.