Questions tagged [max-entropy]
For questions about the max entropy (also called Hartley entropy) or max-relative entropy functions.
19 questions
5 votes, 0 answers
Connection between smooth max-relative entropy and smooth max-information
The max-relative entropy between two states is defined as
$$D_{\max }(\rho \| \sigma):=\log \min \{\lambda: \rho \leq \lambda \sigma\},$$
where $A \leq B$ means that $B - A$ is positive semidefinite. There is also a smoothed…
user1936752 (3,383 rep)
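Several questions on this page start from the same definition of $D_{\max}$, so a small numerical sketch may help. For full-rank $\sigma$, the optimal $\lambda$ equals the largest eigenvalue of $\sigma^{-1/2}\rho\,\sigma^{-1/2}$; the sketch below assumes this full-rank case and uses base-2 logarithms (both are conventions, not stated in the excerpt):

```python
import numpy as np

def dmax(rho, sigma):
    """Max-relative entropy D_max(rho || sigma) = log2 min{lam : rho <= lam*sigma}.
    Assumes sigma is full rank, so the optimal lam is the largest
    eigenvalue of sigma^{-1/2} rho sigma^{-1/2}."""
    w, v = np.linalg.eigh(sigma)                  # spectral decomposition of sigma
    s_inv_half = v @ np.diag(w**-0.5) @ v.conj().T  # Hermitian inverse square root
    m = s_inv_half @ rho @ s_inv_half
    return np.log2(np.linalg.eigvalsh(m).max())

rho = np.diag([0.7, 0.3])
sigma = np.eye(2) / 2
print(dmax(rho, rho))    # → 0.0 (D_max vanishes between identical states)
print(dmax(rho, sigma))  # log2(2 * 0.7) = log2(1.4) ≈ 0.485
```

For rank-deficient $\sigma$ with $\mathrm{supp}(\rho) \not\subseteq \mathrm{supp}(\sigma)$ the quantity is infinite, which this sketch does not handle.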
5 votes, 1 answer
Questions about the relation between max-relative entropy $D_{\max}(\rho||\sigma)$ and max-information
The max-relative entropy between two states is defined as
$$D_{\max }(\rho \| \sigma):=\log \min \{\lambda: \rho \leq \lambda \sigma\},$$
where $A \leq B$ means that $B - A$ is positive semidefinite. In other words, $D_{\max}$…
user1936752 (3,383 rep)
4 votes, 1 answer
What are explicit examples of smoothed conditional min(max) entropies?
Some general discussion of smoothed entropic quantities is found for example in Watrous notes, and an overview and discussion on its operational interpretations in (Koenig et al. 2008). It seems the quantity was introduced in (Renner and Wolf 2004),…
glS (27,670 rep)
4 votes, 0 answers
Why are "smooth entropic quantities" useful/necessary?
Consider the $\epsilon$-smoothed relative max-entropy of $\rho$ with respect to $Q$, defined as (following Watrous' notation from these notes):
$$\mathrm D_{\rm max}^{\epsilon}(\rho\|Q) = \min_{\xi\in B_\epsilon(\rho)} \mathrm D_{\rm…
glS (27,670 rep)
4 votes, 0 answers
Max-relative entropy quasi-convexity inequality under partial trace
The max-relative entropy between two states is defined as
$$D_{\max }(\rho \| \sigma):=\log \min \{\lambda: \rho \leq \lambda \sigma\}.$$
It is known that the max-relative entropy is quasi-convex. That is, for $\rho=\sum_{i \in I} p_{i} \rho_{i}$…
user1936752 (3,383 rep)
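The quasi-convexity property quoted in this question, $D_{\max}(\sum_i p_i \rho_i \| \sum_i p_i \sigma_i) \leq \max_i D_{\max}(\rho_i \| \sigma_i)$, can be spot-checked numerically. A minimal sketch with random states (the sampling routine and base-2 convention are illustrative choices, not from the question):

```python
import numpy as np

def random_state(d, rng):
    """Random density matrix from a complex Ginibre matrix."""
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

def dmax(rho, sigma):
    """D_max(rho || sigma) in bits, assuming sigma is full rank."""
    w, v = np.linalg.eigh(sigma)
    s = v @ np.diag(w**-0.5) @ v.conj().T
    return np.log2(np.linalg.eigvalsh(s @ rho @ s).max())

rng = np.random.default_rng(0)
p = [0.4, 0.6]
rhos = [random_state(3, rng) for _ in p]
sigmas = [random_state(3, rng) for _ in p]

# quasi-convexity: D_max of the averages vs. the worst pair
lhs = dmax(sum(q * r for q, r in zip(p, rhos)),
           sum(q * s for q, s in zip(p, sigmas)))
rhs = max(dmax(r, s) for r, s in zip(rhos, sigmas))
print(lhs <= rhs + 1e-9)  # → True
```

A random check of course does not prove the inequality; it only illustrates what the two sides are.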
4 votes, 1 answer
Non-lockability of quantum max-entropy
Lockability and non-lockability are explained in this paper. A real valued function of a quantum state is called non-lockable if its value does not change by too much after discarding a subsystem. The max-entropy of a quantum state is defined…
user1936752 (3,383 rep)
3 votes, 1 answer
Which quantum entropies are meaningful with respect to continuous distributions of states?
When using a quantum channel to transmit classical information, we consider an ensemble $\mathcal{E} = \{(\rho_x, p(x))\}$ consisting of states $\rho_x$ labelled with a symbol $x$ from a finite alphabet $\Sigma$, each of which is associated with a…
forky40 (8,168 rep)
3 votes, 1 answer
Difference between min/max-entropies and the von Neumann entropy
Consider the (smooth) min-entropy, max-entropy and von Neumann entropy of a given density operator $\rho_A$. Does a small gap between $H_{\max(\min)}(A)_\rho$ and $H(A)_\rho$ imply a small gap between $H_{\min(\max)}(A)_\rho$ and $H(A)_\rho$? Put…
Shadumu (383 rep)
3 votes, 1 answer
Quasi-concavity of max-relative entropy?
The max-relative entropy between two states is defined as
$$D_{\max }(\rho \| \sigma):=\log \min \{\lambda: \rho \leq \lambda \sigma\}.$$
It is known that the max-relative entropy is quasi-convex. That is, for $\rho=\sum_{i \in I} p_{i} \rho_{i}$…
user1936752 (3,383 rep)
3 votes, 1 answer
Continuity of Rényi entropies - limiting cases
The Rényi entropies are defined as
$$S_{\alpha}(\rho)=\frac{1}{1-\alpha} \log \operatorname{Tr}\left(\rho^{\alpha}\right), \alpha \in(0,1) \cup(1, \infty)$$
It is claimed that this quantity is continuous i.e. for $\rho, \sigma$ close in trace…
user1936752 (3,383 rep)
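One of the limiting cases behind this question is $\alpha \to 1$, where $S_\alpha$ recovers the von Neumann entropy. A minimal numerical sketch (base-2 logarithms assumed; the eigenvalue cutoff is an illustrative choice to avoid $\log 0$):

```python
import numpy as np

def renyi(rho, alpha):
    """S_alpha(rho) = log2 Tr(rho^alpha) / (1 - alpha), for alpha != 1."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerically-zero eigenvalues
    return np.log2(np.sum(evals**alpha)) / (1 - alpha)

def von_neumann(rho):
    """S(rho) = -Tr(rho log2 rho), via the spectrum."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log2(evals))

rho = np.diag([0.5, 0.3, 0.2])
for alpha in (0.999, 1.001):
    # both values approach von_neumann(rho) ≈ 1.485 as alpha -> 1
    print(renyi(rho, alpha))
```

This only illustrates the pointwise limit in $\alpha$; the continuity in $\rho$ asked about in the question (trace-norm continuity) is a separate matter.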
3 votes, 1 answer
Do we know the limits of the quantum Tsallis entropy?
Consider the two main generalizations of the von Neumann entropy
\begin{equation}
S(\rho)=-\operatorname{Tr}(\rho \log \rho),
\end{equation}
namely the Rényi entropy
\begin{equation}
R_{\alpha}(\rho)=\frac{1}{1-\alpha} \log…
jmstf94 (73 rep)
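The excerpt truncates before the Tsallis entropy itself; a minimal sketch using the standard quantum Tsallis definition $T_\alpha(\rho) = (\operatorname{Tr}\rho^\alpha - 1)/(1-\alpha)$ (an assumption here, since the question's formula is cut off) shows the $\alpha \to 1$ limit recovering the von Neumann entropy in nats:

```python
import numpy as np

def tsallis(rho, alpha):
    """Quantum Tsallis entropy (Tr rho^alpha - 1)/(1 - alpha), alpha != 1.
    Standard definition assumed; the excerpt above is truncated."""
    evals = np.linalg.eigvalsh(rho)
    return (np.sum(evals**alpha) - 1) / (1 - alpha)

rho = np.diag([0.6, 0.4])
# alpha -> 1 recovers -Tr(rho ln rho) (natural log, since no log appears in T_alpha)
vn = -sum(p * np.log(p) for p in (0.6, 0.4))
print(tsallis(rho, 0.999), vn)  # both ≈ 0.673
```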
2 votes, 0 answers
Partially smoothed max-information and AEP - where's the flaw in my logic?
Sorry for the definitional overload, but I promise there's an interesting question at the end! Please skip to the last section if you're familiar with one-shot information theory.
Definitions
The max-relative entropy between two states is defined…
user1936752 (3,383 rep)
2 votes, 2 answers
When can the max relative entropy be written as $D_{\max}(\rho\|\sigma) = \|\sigma^{-1/2}\rho\sigma^{-1/2}\|_{\infty}$?
The max-relative entropy between two states is defined as $D_{\max}(\rho\|\sigma) = \log\lambda$, where $\lambda$ is the smallest real number that satisfies $\rho\leq \lambda\sigma$, where $A\leq B$ is used to denote that $B-A$ is positive…
user1936752 (3,383 rep)
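For full-rank $\sigma$ the closed form in this question's title agrees with the defining optimization, which can be checked by comparing $\|\sigma^{-1/2}\rho\,\sigma^{-1/2}\|_\infty$ against a direct bisection for the smallest feasible $\lambda$. A sketch of that comparison (the bisection bounds and tolerance are illustrative choices):

```python
import numpy as np

def lam_via_search(rho, sigma, lo=0.0, hi=1e6, iters=80):
    """Smallest lam with lam*sigma - rho >= 0, by bisection on the
    minimum eigenvalue (assumes a feasible lam exists below hi)."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if np.linalg.eigvalsh(mid * sigma - rho).min() >= 0:
            hi = mid   # feasible: shrink from above
        else:
            lo = mid   # infeasible: shrink from below
    return hi

def lam_closed_form(rho, sigma):
    """||sigma^{-1/2} rho sigma^{-1/2}||_inf, valid for full-rank sigma."""
    w, v = np.linalg.eigh(sigma)
    s = v @ np.diag(w**-0.5) @ v.conj().T
    return np.linalg.eigvalsh(s @ rho @ s).max()

rho = np.diag([0.7, 0.3])
sigma = np.diag([0.5, 0.5])
print(lam_via_search(rho, sigma), lam_closed_form(rho, sigma))  # both ≈ 1.4
```

The question's point is precisely when this equality extends beyond the full-rank case (e.g. via pseudo-inverses when $\mathrm{supp}(\rho) \subseteq \mathrm{supp}(\sigma)$); the sketch only covers the unproblematic case.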
2 votes, 1 answer
Relating quantum max-relative entropy to classical maximum entropy
The quantum max-relative entropy between two states is defined as
$$D_{\max }(\rho \| \sigma):=\log \min \{\lambda: \rho \leq \lambda \sigma\},$$
where $A \leq B$ means that $B - A$ is positive semidefinite. In other words,…
develarist (995 rep)
2 votes, 1 answer
What is the relationship between these two definitions for the max-entropy?
On Wikipedia, the max-entropy for classical systems is defined as
$$H_{0}(A)_{\rho}=\log \operatorname{rank}\left(\rho_{A}\right)$$
The term max-entropy in quantum information is reserved for the following definition
$$H_{\max }(A)_{\rho}=2 \cdot…
user1936752 (3,383 rep)
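The two definitions compared in this last question can be contrasted numerically. $H_0 = \log\operatorname{rank}(\rho)$ is easy to compute directly; for the truncated $H_{\max}$ definition, the sketch below uses the Rényi-1/2 entropy $2\log_2\operatorname{Tr}\sqrt{\rho}$, which is what the unconditional $H_{\max}$ reduces to in one-shot information theory (an assumption here, since the excerpt is cut off). The two coincide exactly when the spectrum is flat on the support:

```python
import numpy as np

def h0(rho, tol=1e-12):
    """Hartley / max-entropy: log2 rank(rho)."""
    return np.log2(np.sum(np.linalg.eigvalsh(rho) > tol))

def hmax(rho):
    """Renyi-1/2 entropy 2*log2 Tr sqrt(rho); assumed form of the
    unconditional H_max (the question's definition is truncated)."""
    evals = np.clip(np.linalg.eigvalsh(rho), 0, None)  # guard tiny negatives
    return 2 * np.log2(np.sum(np.sqrt(evals)))

flat = np.eye(4) / 4                        # flat spectrum: the two agree
skew = np.diag([0.97, 0.01, 0.01, 0.01])    # same rank, skewed spectrum
print(h0(flat), hmax(flat))   # → 2.0 2.0
print(h0(skew), hmax(skew))   # h0 stays 2.0, hmax drops below it
```

In general $H_{\max} \leq H_0$, since $H_0$ counts support dimensions regardless of how little weight they carry.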