Whose Complexity Is It Anyway?


Post by antony07

I am reposting two excellent articles by Richard Lipton:
Whose Complexity Is It Anyway?

The three meanings of complexity

Steven Nixon is an expert on national security policy. He has been recognized for distinguished work in the various fields of engineering, intelligence, Congressional staff advising, and space policy. He has worked for various government agencies in the past, and has now founded his own firm, Steven Nixon Consulting.

Today I want to report on a discussion of complexity theory—but not in our sense.

I just heard Nixon speak at a conference run by a group called TTI/Vanguard. They are a for-profit organization that organizes conferences at which their subscribers can interact with leading lights. They run five conferences each year on various aspects of advanced technology. It costs quite a bit to be able to go to even one of their meetings—an order of magnitude more than our meetings and sold in blocks of five—but my “seat” is currently paid for by the College of Computing, so I attend a few of them each year.

Their meetings are fun, with interesting speakers who give strong talks, and structured for lots of audience interaction. We are encouraged to jump in and ask questions—attendees have their own microphones, so it is not too difficult to ask whatever is on your mind. They do not publish proceedings—mainly you have to be there.

This meeting’s focus was on “taming complexity.” As a theorist, that is something I would like to be able to do—is this why it’s called the Complexity Zoo? But, as you might have guessed, their notion of complexity is not our notion of complexity. This caused me to think about the various uses of the word “complexity,” and what they mean. I could think of at least three types of complexity.

Three Roads to Complexity

Two major technical fields I know have areas that call themselves “complexity”—ours and dynamical systems. Yet the meaning used at the conference and discussed directly by Nixon was neither of these. Instead it reflects the way we use “complex” in ordinary speech: big, involved, hard-to-describe, requiring much effort even to comprehend. It included a cool notion that I think you may like called wicked problems, which I will define and discuss in a moment. This is different from our notion of a problem being wickedly hard—it has to do with framing the problem itself.

One of the best quotes of the meeting was from Nixon’s talk, where he quoted the late President Dwight Eisenhower as once saying:
"If a problem cannot be solved, enlarge it."

I wonder how we could use that to attack some of our problems. What does that mean for P=NP?

Note that already we use problem in two senses: P=NP is a research problem, and it is about particular computational problems such as SAT. The former easily enlarges itself—with a few special exceptions, we do not even know how to prove super-linear lower bounds on time or circuit size for problems within the possible range of feasibility. What I’m asking is whether we should enlarge our notion of computational problem, with due formalism including what it means to be complex. First let’s try to categorize the three meanings of complexity.

Complexity: Resource Based

When we say “complexity” we almost always are referring to the resources needed to compute and solve some problem. The resources measured can be: time, space, randomness, nondeterminism, rounds, quantum operations, or other resources. Indeed sometimes we mix these together to get measures that limit two or more resources.
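As a concrete illustration of such a combined measure, here is one standard example (my own addition, not something from the article): a class that bounds time and space simultaneously.

```latex
% TISP(t(n), s(n)): problems decidable by a single algorithm that runs in
% time O(t(n)) AND space O(s(n)) at the same time. Meeting both bounds
% simultaneously can be strictly harder than meeting each one separately.
\[
  \mathrm{TISP}\bigl(t(n),\, s(n)\bigr) =
  \bigl\{\, L : \text{some machine decides } L \text{ in time } O(t(n))
  \text{ and space } O(s(n)) \text{ simultaneously} \,\bigr\}
\]
```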

In our complexity theory we like two types of results: upper bounds and lower bounds. An upper bound, of course, shows that some problem X can be solved within a certain resource bound. A lower bound shows that some problem X cannot be solved within a certain resource bound. For example, one of the great upper bounds is that we can sort n numbers in at most O(n log n) comparisons, and the corresponding lower bound is that Ω(n log n) comparisons are needed.
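For readers who want the reasoning behind that lower bound, here is the standard counting argument, sketched by me rather than spelled out in the article:

```latex
% A comparison-based sorting algorithm is a binary decision tree whose leaves
% must distinguish all n! input orderings, so its worst-case number of
% comparisons is at least the depth needed to accommodate n! leaves:
\[
  \#\,\text{comparisons} \;\ge\; \log_2(n!)
  \;\ge\; \log_2\!\left(\tfrac{n}{2}\right)^{n/2}
  \;=\; \tfrac{n}{2}\log_2\tfrac{n}{2}
  \;=\; \Omega(n\log n).
\]
```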

It is a shame that we cannot prove more such results, especially more lower bounds. But as we have discussed many times before, showing a lower bound is very difficult. It amounts to showing that there is no way to be “clever” and avoid some “obviously” required computations. Even the sorting result must be examined carefully. The lower bound relies on the fact that only comparisons are allowed between the numbers. It is possible to sort faster than this if one can violate the model and do more than just comparisons.

A good example of this is the bead sort created by Joshua Arulanandham, Cristian Calude and Michael Dinneen in 2002—see here. There are potential hardware implementations that allow this method to run in linear time, but they seem not to be practical.

Initially the integers to be sorted are represented as numbers of beads. The beads for each integer occupy a row, and importantly, are left-justified in that row (diagrams from Wikipedia).
[Image: bead sort diagram, beads before the fall (from Wikipedia)]
After the “fall” the poles contain the sorted order. The surprising point is that the individual beads making up each integer have been scrambled, but owing to the left-justification, one can prove that each integer is preserved in the final output.
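Here is a small, purely illustrative simulation of that “fall” (my sketch, assuming non-negative integers; only the hardware realizations mentioned above would make it run in linear time):

```python
def bead_sort(values):
    """Simulate bead sort ("gravity sort") for non-negative integers.

    Each integer is a left-justified row of beads; letting the beads fall
    down the vertical poles leaves the values readable in sorted order.
    """
    if any(v < 0 for v in values):
        raise ValueError("bead sort handles non-negative integers only")
    poles = [0] * max(values, default=0)
    for v in values:              # drop one row of v beads,
        for pole in range(v):     # left-justified across the first v poles
            poles[pole] += 1
    # After the fall, the number of poles holding at least `level` beads is
    # the level-th largest value; reading levels bottom-up gives ascending order.
    n = len(values)
    return [sum(1 for p in poles if p >= level) for level in range(n, 0, -1)]

print(bead_sort([3, 1, 4, 1, 5]))  # -> [1, 1, 3, 4, 5]
```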

I raise this example just to remind us how hard it is to envision all possible ways that can be used to tackle a problem—that is why lower bounds are so difficult.
[Image: bead sort diagram, beads after the fall (from Wikipedia)]
Complexity: Behavior Based

A Google search for the word complexity hits another notion of complexity first. It finds the notion of “a complex system.” We are not far behind, but we are not first. When people who are not theorists think of complexity, they usually think of systems that look like this:
[Image: diagram of a complex system]
When many people say complexity theory, they are usually referring to the behavior of a system. This type of complexity theory is all about behavior, about prediction, about chaos, about the tremendous forking of paths (bifurcations) that can arise from even simple systems that evolve over time.
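To make that forking concrete, here is a tiny sketch (my example, not one from the article) using the logistic map, a textbook one-line system whose long-run behavior splits into 1, 2, 4, ... values and eventually chaos as a single parameter grows:

```python
def attractor(r, x=0.5, warmup=1000, keep=8):
    """Iterate the logistic map x <- r*x*(1-x) and report its long-run values."""
    for _ in range(warmup):       # discard the transient
        x = r * x * (1 - x)
    seen = set()
    for _ in range(keep):         # sample the behavior the system settles into
        x = r * x * (1 - x)
        seen.add(round(x, 4))
    return sorted(seen)

for r in (2.8, 3.2, 3.5, 3.9):
    print(r, attractor(r))
# r = 2.8 settles to one value, 3.2 to two, 3.5 to four; 3.9 wanders chaotically.
```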

I wish, sometimes, that they had called their area something other than complexity theory. But there is not much we can do about it now, since their use is well established and well funded. For example, the Santa Fe Institute has the motto:
"Complexity theory research expanding the boundaries of science."
Great motto—wish we had a cool one like this. Oh well.

There are many potential connections between our notion of resource based complexity and their behavior based complexity theory. Some of them may come through the theory of complexity over the reals and other infinite fields, aspects of which we covered here. I will leave more on those for another time.

Complexity: Real-World Based

One of the foundational assumptions that I believe is bedrock, solid, unchangeable, not to be questioned, is that theory has made progress by its way of formulating problems as objects of analysis in themselves. I will not budge on this issue. But Nixon’s talk recently made me question the related assumption that this manner of stating problems encompasses all the ones we can rigorously attack.

The key is the notion of a wicked problem (WP). It is not that easy to define what they are, but here is the standard list of properties they have:
  • The problem is not understood until after the formulation of a solution.
  • Solutions to wicked problems are not right or wrong.
  • Every wicked problem is essentially novel and unique.
  • Every solution to a wicked problem is a “one shot operation.”
  • Wicked problems have no given alternative solutions.

The notion of a WP is not new. In 1967 West Churchman introduced the concept of wicked problems in a journal article, where he discussed the moral responsibility of Operations Research

“... to inform the manager in what respect our ‘solutions’ have failed to tame his wicked problems.”

Horst Rittel and Melvin Webber formally described the concept of wicked problems later, in 1973, where they contrasted WPs with “trivial” problems such as those from mathematics, chess, or puzzle solving. Ouch. I believe that these problems are pretty “wicked,” but I do agree that they are not wicked in the sense used by Churchman, Rittel, and Webber.

I like the idea that some problems are inherently wicked: that they do not have precise statements of what the problem even is. Therefore they may not have solutions, at least not in the usual sense. I wonder if it is possible to help develop a theory of WPs? To develop a theory that addresses problems that are not well stated. Does this even make sense, or is this an impossible task?

I think it is not. Perhaps the start of a model could be based on algorithmic game theory. Imagine that a group of players have conflicting notions of what they want from a solution. Could notions from game theory help make sense of this? Is the beauty of the idea of a Nash equilibrium that it is a type of solution to a problem that at one time appeared to be wicked?
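As a toy illustration of that last thought (entirely my own sketch, with made-up payoffs), one could encode two players with conflicting preferences and check which joint choices neither wants to deviate from, i.e. the pure Nash equilibria:

```python
from itertools import product

# Hypothetical payoffs: rows are player 1's options, columns are player 2's.
# The players rank the outcomes differently, i.e. they have conflicting
# notions of what a "good" solution is.
payoff_1 = [[3, 0],
            [1, 2]]
payoff_2 = [[1, 3],
            [0, 2]]

def pure_nash(p1, p2):
    """Return all (i, j) from which neither player gains by deviating alone."""
    rows, cols = len(p1), len(p1[0])
    equilibria = []
    for i, j in product(range(rows), range(cols)):
        best_for_1 = all(p1[i][j] >= p1[k][j] for k in range(rows))
        best_for_2 = all(p2[i][j] >= p2[i][k] for k in range(cols))
        if best_for_1 and best_for_2:
            equilibria.append((i, j))
    return equilibria

print(pure_nash(payoff_1, payoff_2))  # -> [(1, 1)] for these payoffs
```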

Open Problems

Can we help formalize the notion of a wicked problem? What would such a theory look like?

Richard J. Lipton
Source