
An Interview with Florian Potra

Florian A. Potra earned a Ph.D. in Mathematics from the University of Bucharest, Romania. After an Andrew Mellon Postdoctoral Fellowship at the University of Pittsburgh, he joined the faculty of the University of Iowa, first as an Associate Professor of Mathematics, and then as a Professor of Mathematics and Computer Science. In 1997-1998, he served as a Program Director in Applied and Computational Mathematics at the National Science Foundation. Since 1998 he has been a Professor of Mathematics and Statistics at the University of Maryland, Baltimore County. He is also a Faculty Appointee at the Mathematical and Computational Sciences Division of the National Institute of Standards and Technology.

Dr. Potra has published more than 120 research papers in prestigious professional journals. He is the Regional Editor for the Americas of the journal Optimization Methods and Software, and serves on the editorial board of three other well-known mathematical journals.

Ivars Peterson: When did you first become interested in mathematics?

Florian Potra: When I was a child, my grandfather used to give me mathematical puzzles. He was proud that I could solve some of them. I was able to do all sorts of arithmetical operations in my head. But then I went to school, and my elementary school teacher managed to turn me against math.

As a teenager I was mainly interested in literature, poetry. I wanted to become a writer, which was pretty difficult in a communist country. At the time, there was an atmosphere of relaxation. These were the first years of [Nicolae] Ceaușescu in power, and many people were optimistic. After several years, however, he cracked down on all the freedoms of expression.

My father was quite skeptical about a possible career in literature; he said that science was a much safer bet. As it happened, [the government] opened a special class for those talented and gifted in mathematics. They chose about 30 students from all over Transylvania. We had a special class, a very advanced curriculum—abstract algebra, axiomatic geometry, calculus. On top of that, we had people from universities giving lectures. And the math teacher was excellent. He gave me several books. One of them was Norbert Wiener’s autobiography I Am a Mathematician.
 
Somehow by reading that book, I decided to become a mathematician. I saw that mathematicians do not necessarily have to be boring. I then studied for my entry exam. That was pretty rigorous in Romania. I managed to get a perfect score. I got into the Babes-Bolyai University, one of the best universities in southeastern Europe. The name commemorates two brilliant Transylvanians, the Romanian physician Victor Babes and the Hungarian mathematician János Bolyai.

Students in the special math high school class had access to the Institute for Scientific Computing (Institutul de Calcul) of the Romanian Academy. They had their own computers, built there, and we learned to do some machine language programming. That was in the 60s, and it was interesting. I kept up a long relationship with the director of that institute, the Academician Tiberiu Popoviciu. He was about 60 when I met him as a high school student. Later he was my Master’s thesis advisor.

IP: What was the topic of your Master’s thesis?

FP: Because of my admiration for Tiberiu Popoviciu, I combined some good classical analysis with some numerical analysis, and a little bit of functional analysis, but that was superficial. Seeing my interest in functional analysis, my advisor told me, “If you want to learn functional analysis, go to Bucharest. There is a young guy there who’s a genius: Ciprian Foias. See if he is willing to take you as a Ph.D. student.”

I went there, and I told Foias: “I am a student of Tiberiu Popoviciu, and he advised me to study with you for a Ph.D.” Foias was hesitant. After some discussion, Foias said, “Come to my place. I’ll give you some problems. If you can solve them, then I’ll take you on.” That’s how I became his student.

I started to work with him. But he defected in 1978 at the International Congress of Mathematicians in Helsinki. Here I was, with my thesis almost written and no Ph.D. advisor. Moreover, I was put on the blacklist because, if your advisor defects, there is a question mark about yourself. I survived. I finished my Ph.D. thesis with Constantin Apostol, a great operator theorist.

I wrote my thesis mainly on functional analysis, but I always liked the idea of applications. Again, Norbert Wiener was my hero. He had no problem moving from the most abstract subjects to the most applied subjects. My advisor Ciprian Foias was considered a pure mathematician—specializing in functional analysis. He was the creator of the Romanian school of operator theory. But he later got the Norbert Wiener Prize in Applied Mathematics [1995].

In 1982, I got an Andrew Mellon Postdoctoral Fellowship at the University of Pittsburgh, and I forgot to go back [to Romania]. At Pittsburgh, two people influenced me a lot. One was my mentor Werner Rheinboldt, a very well-read individual—history, literature, mathematics. He was an exceptional numerical analyst. The other was Pesi Masani. I had met him in Romania. Together with Rheinboldt, he was instrumental in arranging for the postdoctoral fellowship.

Pesi Masani used to talk for hours about cybernetics, Wiener, his philosophy. Fascinated by Wiener, just as I was, Pesi Masani got his Ph.D. at Harvard in 1946. He was from India. He worked with Wiener a lot in the 50s. They wrote a number of very good joint papers. And I used the Wiener-Masani prediction theory in my own research.

Werner Rheinboldt told me after my postdoc was over and I got an offer from the University of Iowa, “If you go to Iowa and you want to do applied math, then do it applied. Go and talk to engineers.” So I did. I was very fortunate to meet an exceptional mechanical engineer, Ed Haug. Last year, he got the D’Alembert Award [American Society of Mechanical Engineers]. And a chemical engineer, Greg Carmichael.

I worked on the National Advanced Driving Simulator, which was the largest project of its kind at the time. The idea is that you want to test drive a car before you build it. The simulator is housed in a huge building. The car model you want to drive is in the computer.  Changing the car model means changing some parameters in the equations. The driver sits in a sort of capsule that moves on a platform having six degrees of freedom. It moves longitudinally on a rail and can then also go laterally. You have the clear illusion that you are driving a car. And you have 3D graphics all around.

The challenge here was to solve the equations of motion in real time. And the equations of motion were mixed differential algebraic equations (DAEs). In theory, you can show that a solution exists, but the engineer wants the numerical solution, not an existence proof. And he wants it in real time, because you need to provide the simulator with a new position and velocity every 5 milliseconds, as I recall, in order to create the sensation that you are really driving a car.
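For reference, the kind of system meant here can be sketched in standard multibody notation (a generic form, not necessarily the exact formulation used for the simulator):

\[
M(q)\,\ddot{q} = f(q,\dot{q},t) - \Phi_q(q,t)^{\mathsf T}\lambda, \qquad \Phi(q,t) = 0,
\]

where q collects the generalized coordinates of the vehicle model, M(q) is the mass matrix, f the applied and velocity-dependent forces, \Phi(q,t) = 0 the joint constraints, and \lambda the Lagrange multipliers that enforce them. The differential equations and the algebraic constraints must be advanced together, which is what makes the system a DAE rather than an ordinary differential equation.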

So, the mission was to solve the equations in 5 milliseconds. We had an Alliant FX/8—a minisupercomputer. It had eight vector processors. Extremely good people implemented the integration algorithm on it. With Ed Haug, I did some of the theory and some of the numerical experiments.

From there to robotics is a small step, in the sense that the equations of motion of a robot are also mixed differential algebraic equations, because you have mechanical systems that are connected by rods, by links, by all kinds of joints: translational joints, revolute joints, composite joints, and universal joints. All these joints can be represented as equality constraints (bilateral constraints). A robot also comes into contact with objects. Then you have to consider the contact problem. If you want to solve it fast, you have to make simplifying hypotheses, for example that the objects are rigid. Rigid bodies cannot interpenetrate, and this leads to inequality constraints (unilateral constraints). It turns out that classical mechanics and this assumption of rigidity are not compatible.
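In standard notation (a generic way of writing the two kinds of constraints, not a formula quoted from any particular paper), a joint is a bilateral constraint, an equality, while non-penetration at a contact is a unilateral constraint coupled to the normal contact force by a complementarity condition:

\[
\Phi_i(q) = 0 \quad\text{(joints)}, \qquad 0 \le g_j(q) \;\perp\; \lambda_j \ge 0 \quad\text{(contacts)},
\]

which says that the gap g_j between the bodies and the contact force \lambda_j are both nonnegative and at least one of them is zero: the force can act only while the bodies actually touch.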

There is a simple example, which is more than 100 years old, that shows that such a system may not have a solution in the classical sense. So now you try to find an approximate numerical solution of an equation that may not have a solution in a classical sense. This is a difficult problem.

It turns out that the right approach was found by Jean Jacques Moreau. He thought about setting it up as a differential inclusion rather than a differential equation. At the time, I had a very good Ph.D. student, Mihai Anitescu. He is now at the Argonne National Laboratory. We came up with a discrete model that always has a solution. Our discrete model was general enough to be able to simulate all kinds of mechanical systems with contact and friction. We implemented the discrete model, and the numerical simulations were very close to the results from real experiments.

This was called a time-stepping scheme. Moreau himself had a time-stepping scheme, and some others had them, too. But we proved that our scheme always has a solution. At each time step we needed the numerical solution of a linear complementarity problem, which is non-monotone because of the presence of friction. In essence, we had a discrete model that worked.
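For reference, a linear complementarity problem has the standard form (the particular matrix and vector assembled by the time-stepping scheme are not reproduced here): given a square matrix A and a vector b, find z with

\[
z \ge 0, \qquad Az + b \ge 0, \qquad z^{\mathsf T}(Az + b) = 0.
\]

When A is positive semidefinite the problem is monotone and comparatively well behaved; friction destroys that structure, which is why proving that the scheme always has a solution is a substantive result.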

Later on, with Mihai Anitescu and another former graduate student of mine, Bogdan Gavrea, we managed to prove that as you take the discretization parameter to zero, the solution of the discrete model indeed converges to the solution of a measure differential inclusion. This had been done before us by David Stewart, but he did it only in the particular case when unilateral constraints alone are present. Stewart actually used our model to prove this wonderful result. A short version of his result was published in Comptes Rendus, because with this he had solved a 100-year-old problem, Painlevé's paradox, which had originally been published in the same journal. And he wrote a wonderful review paper in SIAM Review about the simulation of multibody systems with contact and friction. And robots are such systems.

In order to have robots that are highly intelligent, in the sense that they can adapt to new tasks very easily, we have to have very good and efficient solvers for such problems. The state of the art now is that if you want to build an industrial robot, you train that robot for a year to do a job, and it does it perfectly. But by the time it does it perfectly, the market has changed, that product is no longer useful, and it is very difficult to train the robot to do something else. We could eventually describe the robot's task by such equations, but for the time being we cannot solve them fast enough. And there are some other problems involving such diverse mathematical fields as dynamical systems theory, combinatorics, optimization, statistical learning theory, and algebraic and differential topology.

What I want to say is that there are some mathematical problems that still have to be solved in order to have, on a large scale, truly intelligent robots that adapt easily to different tasks. New advances are also needed in artificial intelligence—you want to be sure that the robot can think for itself to a certain degree, not only follow a script but also show some creativity.

IP: How did you end up at the University of Maryland, Baltimore County?

FP: I was asked to serve as a Program Director in applied and computational math at the National Science Foundation. I thought that staying one or two years in Washington would be a good opportunity to meet some other people, and so on. My wife came with me, and she didn’t want to go back. She liked the [Washington] area very much.

At UMBC, we have a small but very good applied math department. I also work one day per week at NIST. There are a lot of opportunities in the Washington area. Last year, I took advantage of such an opportunity. I was at Georgetown University as a Royden B. Davis Chair for Interdisciplinary Studies, a visiting position. I had a wonderful experience there.

IP: What is your current research?

FP: I continue to try to prove new results on these mixed differential algebraic inequalities. But I also do a lot of work in optimization. I think that in the 21st century optimization is going to be extremely important. Because of the progress in solving nonlinear partial differential equations (PDEs) quickly and because we have very good computers, I think that now we can do more than simulation. We could determine the parameters of a system governed by PDEs, for example, so that it performs optimally. This is known as PDE-constrained optimization, and I think it has a great future.
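In abstract form (a generic statement of the problem class rather than any particular application), a PDE-constrained optimization problem reads

\[
\min_{u,\,p}\; J(u,p) \quad\text{subject to}\quad e(u,p) = 0, \quad p \in P_{\mathrm{ad}},
\]

where u is the state of the system, p the design or control parameters, e(u,p) = 0 the governing partial differential equation (typically in weak form), J the performance criterion, and P_{\mathrm{ad}} the set of admissible parameters.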

I have done some work at NIST with a well-known structural engineer there, Emil Simiu, trying to optimize structures subjected to multiple hazards, for example hurricanes and earthquakes. Two papers appeared on this topic last year and another one has just been published. This study is still at the beginning, but the results are encouraging.  In general the unknowns are the shape of the structure, the configuration of the structural system, the member cross sections, and so forth. With such a variety of unknowns the difficulties in solving the optimization problem can be insuperable. But suppose you decide that some aspects of the design problem are known, say the general shape of the structure. How do you then design a structure so that it meets performance requirements optimally while being subjected to all kinds of hazards?

And these days you also want to consider the carbon footprint. The cost is not just the cost of materials and labor. Concrete, for example, needs a lot of energy to be produced. Optimization can be performed by taking into account the need to reduce the consumption of both embodied and operating energy. If the reduction of the carbon footprint is a specified requirement, then you have a really complex problem. Optimization can help engineers achieve more resilient structures with a considerably reduced carbon footprint.
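A rough sketch of this kind of design problem (generic notation; the formulations in the papers with Simiu may differ): with design variables x, say member cross sections and the structural configuration, one might pose

\[
\min_{x \in X}\; C_{\mathrm{materials}}(x) + C_{\mathrm{labor}}(x) + C_{\mathrm{energy}}(x)
\quad\text{subject to}\quad g_h(x) \le 0 \ \text{for every hazard } h,
\]

where each g_h collects the performance requirements under hazard h, such as a design hurricane or earthquake, and C_{\mathrm{energy}} accounts for the embodied and operating energy behind the carbon footprint.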

IP: Your talk has a fair amount of history in it.

FP: I’ve always been interested in history. In high school, I had the idea that I was going to become a writer, so I had to read a lot. For this talk, I went several times to the Library of Congress to look for some things that are not readily available.

IP: Is that something you would pursue more as you go on?

FP: I’ve always wanted to write a book on the history of some parts of mathematics or about some mathematicians. For example, Pesi Masani wrote probably the best book on Wiener. I would eventually like to write a book about mathematics and robotics, but I’m so caught up in research. Writing a book means having a long sabbatical or retiring. Pesi wrote his book about Wiener after retiring. Hopefully, 10 years from now, I’ll be ready to retire and write a book.

The problem is that I never learned English properly. As a child, I spoke Romanian, Hungarian, and German, because I was born in Cluj. I had friends speaking those languages. In high school, I had French and Russian, which I learned very well. But I never studied English. I just picked it up by myself. I also picked up some Italian, Spanish, and Brazilian Portuguese.  But I intend to be a good scholar of English. When I write a book, I want to write it in English rather than French.

Read about Florian Potra's Carriage House Lecture: "Mathematics and Robotics."

Lecture Podcast (mp3)

News Date: Tuesday, April 13, 2010