Concepts of Combinatorial Optimization, Volume 1
eBook - ePub

About This Book

Combinatorial optimization is a multidisciplinary scientific area, lying at the interface of three major scientific domains: mathematics, theoretical computer science and management.

The three volumes of the Combinatorial Optimization series aim to cover a wide range of topics in this area, from fundamental notions and approaches to several classical applications of combinatorial optimization.

Concepts of Combinatorial Optimization is divided into three parts:

  • On the complexity of combinatorial optimization problems, which presents the basics of worst-case and randomized complexity;
  • Classical solution methods, which presents the two best-known methods for solving hard combinatorial optimization problems: Branch-and-Bound and Dynamic Programming;
  • Elements from mathematical programming, which presents the fundamentals of mathematical-programming-based methods that have been at the heart of Operations Research since the origins of the field.

Concepts of Combinatorial Optimization, Volume 1 by Vangelis Th. Paschos is available in PDF and ePUB format, under Mathematics & Discrete Mathematics.

Information

Publisher: Wiley-ISTE
Year: 2012
ISBN: 9781118600238
Edition: 1

PART I

Complexity of Combinatorial Optimization Problems

Chapter 1

Basic Concepts in Algorithms and Complexity Theory

1.1. Algorithmic complexity

In algorithmic theory, a problem is a general question to which we wish to find an answer. This question usually has parameters or variables whose values have yet to be determined. A problem is posed by giving a list of these parameters as well as the properties to which the answer must conform. An instance of a problem is obtained by giving explicit values to each of the parameters of the problem.
An algorithm is a sequence of elementary operations (variable assignments, tests, branches, etc.) that, when given an instance of a problem as input, produces a solution to this problem as output after execution of its final operation.
The two most important parameters for measuring the quality of an algorithm are its execution time and the memory space that it uses. The first is expressed as the number of instructions necessary to run the algorithm. Counting instructions, rather than seconds, is justified by the fact that the same program executes the same number of instructions on two different machines, while the time taken varies with the respective speeds of the machines. We generally consider that an instruction equates to an elementary operation, for example an assignment, a test, an addition, a multiplication, etc. What we call the complexity in time, or simply the complexity, of an algorithm gives an indication of the time it will take to solve a problem of a given size. In reality it is a function that associates with the size of an instance an order of magnitude of the number of instructions necessary to solve that instance.

The second parameter corresponds to the number of memory units used by the algorithm to solve a problem. The complexity in space is a function that associates with the size of an instance an order of magnitude of the number of memory units used for the operations necessary to solve that instance.
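To make this concrete, here is a small sketch (not from the book) that counts elementary operations explicitly: an instrumented insertion sort in which each comparison and assignment increments a counter, so that the instruction count, rather than wall-clock time, measures the cost. The function name and the choice of which operations to count are illustrative assumptions.

```python
def insertion_sort_with_count(a):
    """Sort a copy of `a`, counting comparisons and assignments
    as the elementary operations (an illustrative convention)."""
    a = list(a)
    ops = 0
    for i in range(1, len(a)):
        key = a[i]; ops += 1               # one assignment
        j = i - 1
        while j >= 0:
            ops += 1                       # one comparison: a[j] > key
            if a[j] > key:
                a[j + 1] = a[j]; ops += 1  # one assignment (shift)
                j -= 1
            else:
                break
        a[j + 1] = key; ops += 1           # final placement
    return a, ops

# Worst-case instance of size n = 100: reverse-sorted input, giving
# an operation count on the order of n^2.
sorted_a, ops = insertion_sort_with_count(range(100, 0, -1))
```

On an already-sorted input of the same size the counter grows only linearly, which is exactly the gap between instances of the same size that the next section's worst-case framework addresses.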
There are several sets of hypotheses concerning the “standard configuration” that we use as a basis for measuring the complexity of an algorithm. The most commonly used framework is the one known as “worst-case”. Here, the complexity of an algorithm is the number of operations carried out on the instance that represents, among all instances of a fixed size, the worst configuration for its execution; this is the framework used in most of this book. It is not, however, the only framework for analyzing the complexity of an algorithm. Another framework often used is “average-case” analysis. This kind of analysis consists of finding, for a fixed instance size n, the average execution time of an algorithm over all instances of size n; for this analysis we assume that the instances occur according to a specific probability distribution, most often taken to be uniform.

There are three main reasons why worst-case analysis is used more often than average-case analysis. The first is psychological: a worst-case result tells us for certain that the algorithm being analyzed can never have a level of complexity higher than that shown by the analysis; in other words, the result gives an upper bound on the complexity of the algorithm. The second reason is mathematical: worst-case results are often easier to obtain than average-case results, which very often require more complex mathematical tools and analysis. The third reason is “analysis portability”: the validity of an average-case analysis is limited by the assumptions made about the distribution of the instances; if the assumptions change, the original analysis is no longer valid.
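The distinction between the two frameworks can be sketched (this example is not from the book) with linear search, whose cost is easy to compute exactly: over all instances of size n, the worst case is n comparisons, while the average under a uniform distribution over the target's position is (n + 1)/2. The function name is a hypothetical choice.

```python
def linear_search_cost(a, target):
    """Return the number of comparisons linear search performs
    before finding `target` in list `a`."""
    for count, x in enumerate(a, start=1):
        if x == target:
            return count
    return len(a)

n = 1000
a = list(range(n))

# Worst case over all instances of size n: the target sits in the
# last position, costing n comparisons.
worst = max(linear_search_cost(a, t) for t in a)

# Average case under a uniform distribution of the target's position:
# (1 + 2 + ... + n) / n = (n + 1) / 2 comparisons.
average = sum(linear_search_cost(a, t) for t in a) / n
```

Note how the average-case figure depends on the assumed uniform distribution, while the worst-case figure does not; this is the “analysis portability” point above.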

1.2. Problem complexity

The definition of the complexity of an algorithm can be easily transposed to problems. Informally, the complexity of a problem is equal to the complexity of the best algorithm that solves it (this definition is valid independently of which framework we use).
Let us take a size n and a function f(n). Thus:
– TIMEf(n) is the class of problems for w...

Table of contents

  1. Cover
  2. Title Page
  3. Copyright
  4. Preface
  5. Part I: Complexity of Combinatorial Optimization Problems
  6. Part II: Classical Solution Methods
  7. Part III: Elements from Mathematical Programming
  8. List of Authors
  9. Index
  10. Summary of Volume 2: Paradigms of Combinatorial Optimization
  11. Summary of Volume 3: Applications of Combinatorial Optimization