Questions tagged [karush-kuhn-tucker]

In mathematical optimization, the Karush–Kuhn–Tucker (KKT) conditions, also known as the Kuhn–Tucker conditions, are first-order necessary conditions for a solution in nonlinear programming to be optimal, provided that some regularity conditions are satisfied. Allowing inequality constraints, the KKT approach to nonlinear programming generalizes the method of Lagrange multipliers, which allows only equality constraints. The system of equations and inequalities corresponding to the KKT conditions is usually not solved directly, except in the few special cases where a closed-form solution can be derived analytically. In general, many optimization algorithms can be interpreted as methods for numerically solving the KKT system of equations and inequalities.

The KKT conditions were originally named after Harold W. Kuhn and Albert W. Tucker, who first published them in 1951. Later scholars discovered that the necessary conditions for this problem had already been stated by William Karush in his 1939 master's thesis.

The KKT conditions include stationarity, primal feasibility, dual feasibility, and complementary slackness.
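
In the standard form $\min f(x)$ subject to $g_i(x) \le 0$ for $i = 1, \ldots, m$ and $h_j(x) = 0$ for $j = 1, \ldots, k$ (the notation below follows the usual textbook convention), the four conditions on a candidate point $x^\star$ with multipliers $\mu \in \mathbb{R}^m$ and $\nu \in \mathbb{R}^k$ read:

$$\begin{aligned} &\nabla f(x^\star) + \sum_{i=1}^m \mu_i \nabla g_i(x^\star) + \sum_{j=1}^k \nu_j \nabla h_j(x^\star) = 0 && \text{(stationarity)} \\ &g_i(x^\star) \le 0, \quad h_j(x^\star) = 0 && \text{(primal feasibility)} \\ &\mu_i \ge 0 && \text{(dual feasibility)} \\ &\mu_i \, g_i(x^\star) = 0 && \text{(complementary slackness)} \end{aligned}$$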

546 questions
312
votes
8 answers

Please explain the intuition behind the dual problem in optimization.

I've studied convex optimization pretty carefully, but don't feel that I have yet "grokked" the dual problem. Here are some questions I would like to understand more deeply/clearly/simply: How would somebody think of the dual problem? What…
32
votes
4 answers

KKT and Slater's condition

I was studying Stephen Boyd's textbook and got confused by the KKT part. The book says the following: for any convex optimization problem with differentiable objective and constraint functions, any points that satisfy the KKT conditions are primal…
27
votes
2 answers

What is the intuition behind Slater's condition in optimization? (And other constraint qualifications.)

I would like to "grok" Slater's condition and other constraint qualification conditions in optimization. Slater's condition is only one of many different constraint qualifications in the optimization literature. Which one is the most fundamental? …
19
votes
2 answers

Are the KKT conditions necessary and sufficient for any convex problem?

In Boyd's Convex Optimization, p. 243: for any optimization problem ... for which strong duality obtains, any pair of primal and dual optimal points must satisfy the KKT conditions, i.e. $\text{strong duality} \implies \text{KKT is}$ …
16
votes
0 answers

A puzzling KKT for LMI vs. scalar constraint

I am trying to understand the KKT conditions for LMI constraints in order to solve my original question, KKT conditions for $\max \log \det(X)$ with LMI constraints. In the meantime, I found a much simpler problem that does not go through when…
14
votes
1 answer

Help me organize these concepts — KKT conditions and dual problem

This is a long question in which I explain my current understanding of certain ideas. If anyone is interested in reading this and would like to provide any commentary/feedback that may help me understand these ideas more clearly, or that you think…
13
votes
1 answer

Simple explanation of Lagrange multipliers with multiple constraints

I'm studying support vector machines and in the process I've bumped into Lagrange multipliers with multiple constraints and the Karush–Kuhn–Tucker conditions. I've been trying to study the subject, but still can't get a good enough grasp of it.…
10
votes
4 answers

Minimize $-\sum\limits_{i=1}^n \ln(\alpha_i + x_i)$

While solving PhD entrance exams, I came across the following problem: minimize the function $f(x) = -\sum_{i=1}^n \ln(\alpha_i + x_i)$ for fixed $\alpha_i > 0$ under the conditions $\sum_{i=1}^n x_i = 1$ and $x_i \ge 0$. I was trying to use KKT…
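
(A sketch of how the KKT conditions resolve this particular problem, namely the classical water-filling argument: with multipliers $\lambda_i \ge 0$ for $x_i \ge 0$ and $\nu$ for the equality constraint, stationarity gives $-\frac{1}{\alpha_i + x_i} - \lambda_i + \nu = 0$, and complementary slackness $\lambda_i x_i = 0$ then forces $x_i = \max\{0,\, 1/\nu - \alpha_i\}$, with $\nu > 0$ chosen so that $\sum_{i=1}^n x_i = 1$.)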
10
votes
1 answer

How do we determine which constraints are active in KKT?

Suppose there is a constrained optimization problem with inequality constraints. We can solve it using the Karush–Kuhn–Tucker conditions. My question is: how do we determine which constraints are active and which are inactive? I read it in a KKT post,…
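
(The standard procedure is case enumeration: guess which constraints are active, solve the resulting equality-constrained stationarity system, and keep the case whose solution satisfies all of the KKT conditions. A minimal sympy sketch on a made-up toy problem, minimize $(x-2)^2$ subject to $x \le 1$; all names here are illustrative.)

```python
# Toy illustration of picking the active set by case-splitting:
# minimize (x - 2)**2  subject to  x <= 1  (hypothetical example).
import sympy as sp

x, mu = sp.symbols('x mu', real=True)
f = (x - 2)**2
g = x - 1  # constraint written as g(x) <= 0

# Case 1: constraint inactive (mu = 0) -> plain unconstrained stationarity.
x1 = sp.solve(sp.diff(f, x), x)[0]   # x = 2, but g(2) = 1 > 0: infeasible, reject
# Case 2: constraint active (g(x) = 0) -> solve for x, then for mu.
x2 = sp.solve(g, x)[0]               # x = 1
mu2 = sp.solve(sp.diff(f + mu*g, x).subs(x, x2), mu)[0]  # mu = 2

# Case 2 satisfies primal feasibility, dual feasibility (mu >= 0), and
# complementary slackness, so x = 1 is the KKT point (here, the minimizer).
print(x1, x2, mu2)  # 2 1 2
```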
9
votes
2 answers

Understanding Karush-Kuhn-Tucker conditions

Suppose we want to use the K–T conditions to find the optimal solution to: \begin{array}{cc} \max & (\text{or } \min)\ z = f\left(x_{1}, x_{2}, \ldots, x_{n}\right) \\ \text{s.t.} & g_{1}\left(x_{1}, x_{2}, \ldots, x_{n}\right) \leq b_{1}…
9
votes
2 answers

Question about KKT conditions and strong duality

I am confused about the KKT conditions. I have seen similar questions asked here, but I think none of the questions/answers cleared up my confusion. In Boyd and Vandenberghe's Convex Optimization [Sec. 5.5.3], KKT is explained in the following…
9
votes
1 answer

How to solve this nonlinear constrained optimization problem

I have the following nonlinear optimization problem: $$ \begin{align*} \text{Find } x \text{ that maximizes } & \frac{1}{\|Ax\|} (Ax)^{\top} y \\ \text{Subject to } & \sum_{i=1}^n x_i = 1 \\ & x_i \geq 0 \; \forall \: i \in \{1\dots n\} \\ …
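
(For anyone who needs numbers rather than a KKT analysis: a hedged numerical sketch using scipy's SLSQP solver, which accepts the simplex constraint directly; $A$, $y$, and the dimensions below are made-up test data.)

```python
# Hypothetical sketch: maximize (Ax)^T y / ||Ax|| over the probability simplex.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
m, n = 4, 5
A = rng.standard_normal((m, n))
y = rng.standard_normal(m)

def neg_objective(x):
    Ax = A @ x
    return -(Ax @ y) / np.linalg.norm(Ax)   # negate: SLSQP minimizes

res = minimize(
    neg_objective,
    x0=np.full(n, 1.0 / n),                 # start at the simplex center
    method='SLSQP',
    bounds=[(0.0, None)] * n,               # x_i >= 0
    constraints=[{'type': 'eq', 'fun': lambda x: x.sum() - 1.0}],  # sum x_i = 1
)
print(res.x, -res.fun)
```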
9
votes
1 answer

Big picture behind how to use KKT conditions for constrained optimization

What is the point of the KKT conditions in constrained optimization? In other words, what is the best way to use them? I have seen examples in different contexts, but I am missing a short overview of the procedure, in one or two sentences. Should we use…
8
votes
1 answer

Geometric Meaning of KKT Conditions

I am trying to state the KKT conditions for an optimization problem with both inequality and equality constraints in geometric form, so that I get the big picture better. Is this what they say? At the optimal solution, the gradient of the function must be a…
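
(In the standard $\min f$, $g_i \le 0$, $h_j = 0$ convention, the geometric statement is: at the optimum $x^\star$, $$-\nabla f(x^\star) = \sum_{i \in \mathcal{A}(x^\star)} \mu_i \nabla g_i(x^\star) + \sum_j \nu_j \nabla h_j(x^\star), \qquad \mu_i \ge 0,$$ where $\mathcal{A}(x^\star)$ is the set of active inequality constraints; the negative objective gradient lies in the cone generated by the active inequality-constraint gradients plus the span of the equality-constraint gradients.)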
7
votes
2 answers

When do constraint qualifications imply strong duality?

Assumptions/conventions: I am interested in problems of the form $$ \inf_{x \in \mathbb{R}^n} ~ f_0(x) ~:~ f_i(x) \leq 0 ~\forall ~i \in [m], ~~ g_i(x) = 0 ~\forall ~i \in [k]$$ where $f_i$ are convex and differentiable, and $g_i$ are affine. I…