comm.SphereDecoder
Decode input using sphere decoder
Description
The Sphere Decoder System object™ decodes the symbols sent over N_{T} transmit antennas using the sphere decoding algorithm.
To decode input symbols using a sphere decoder:
1. Define and set up your sphere decoder object. See Construction.
2. Call step to decode input symbols according to the properties of comm.SphereDecoder. The behavior of step is specific to each object in the toolbox.
Note
Starting in R2016b, instead of using the step method to perform the operation defined by the System object, you can call the object with arguments, as if it were a function. For example, y = step(obj,x) and y = obj(x) perform equivalent operations.
Construction
H = comm.SphereDecoder creates a System object, H. This object uses the sphere decoding algorithm to find the maximum-likelihood solution for a set of received symbols over a MIMO channel with N_{T} transmit antennas and N_{R} receive antennas.
H = comm.SphereDecoder(Name,Value) creates a sphere decoder object, H, with the specified property name set to the specified value. Name must appear inside single quotes (''). You can specify several name-value pair arguments in any order as Name1,Value1,…,NameN,ValueN.
H = comm.SphereDecoder(CONSTELLATION,BITTABLE) creates a sphere decoder object, H, with the Constellation property set to CONSTELLATION and the BitTable property set to BITTABLE.
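As a sketch, the three construction syntaxes above might be used as follows. The QPSK constellation and Gray-coded bit table are illustrative values chosen here, not requirements of the object:

```matlab
% Default object: QPSK constellation with average power 1.
H1 = comm.SphereDecoder;

% Name-value construction.
qpsk = (1/sqrt(2))*[1+1i; -1+1i; -1-1i; 1-1i];   % unit-power QPSK (assumed)
bits = [0 0; 0 1; 1 1; 1 0];   % one row of bits per constellation point (assumed Gray labels)
H2 = comm.SphereDecoder('Constellation',qpsk,'BitTable',bits);

% Value-only construction: CONSTELLATION, then BITTABLE.
H3 = comm.SphereDecoder(qpsk, bits);
```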
Properties

Constellation
Signal constellation per transmit antenna. Specify the constellation as a complex column vector containing the constellation points to which the transmitted bits are mapped. The default setting is a QPSK constellation with an average power of 1. The object assumes that the same constellation is used on each transmit antenna.

BitTable
Bit mapping used for each constellation point. Specify the bit mapping for the symbols that the Constellation property specifies as a numerical matrix. The matrix size must be [ConstellationLength, BitsPerSymbol], where ConstellationLength equals the length of the Constellation property value and BitsPerSymbol is the number of bits that each symbol encodes.

InitialRadius
Initial search radius of the decoding algorithm. Specify the initial search radius for the decoding algorithm as either 'Infinity' (default) or 'ZF Solution'. When you set this property to 'Infinity', the object sets the initial search radius to infinity. When you set this property to 'ZF Solution', the object sets the initial search radius to the zero-forcing solution, calculated from the pseudo-inverse of the input channel estimate.

DecisionType
Specify the decoding decision method as either 'Soft' (default) or 'Hard'. When you set this property to 'Soft', the decoder outputs log-likelihood ratios (LLRs), or soft bits. When you set this property to 'Hard', the decoder converts the soft LLRs to bits.
Methods
step  Decode received symbols using sphere decoding algorithm 
Common to All System Objects  

release  Allow System object property value changes 
Examples
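The Examples section is empty in this extract. As a minimal sketch of a soft-decoding run (the step signature follows this page's description; the channel-orientation convention and the QPSK bit table are assumptions):

```matlab
% Sketch: soft sphere decoding of QPSK over a 2x2 channel (noiseless).
qpsk = (1/sqrt(2))*[1+1i; -1+1i; -1-1i; 1-1i];   % unit-power QPSK (assumed)
bitTable = [0 0; 0 1; 1 1; 1 0];                 % assumed Gray bit labels
decoder = comm.SphereDecoder('Constellation',qpsk,'BitTable',bitTable);

nSym = 4; nT = 2; nR = 2;
idx = randi(4, nSym, nT);                        % random constellation indices
tx  = qpsk(idx);                                 % nSym-by-nT transmitted symbols
chan = (randn(nSym,nT,nR) + 1i*randn(nSym,nT,nR))/sqrt(2);  % Rayleigh taps
rx = zeros(nSym, nR);
for n = 1:nSym
    rx(n,:) = tx(n,:)*squeeze(chan(n,:,:));      % assumed rx = tx*H orientation
end
llr = decoder(rx, chan);                         % soft output: LLR per coded bit
```

With DecisionType left at its 'Soft' default, the sign of each LLR indicates the decoded bit and its magnitude the reliability.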
Algorithm
This object implements a softoutput maxlog a posteriori probability (APP) MIMO detector by means of a softoutput SchnorrEuchner sphere decoder (SESD), implemented as single tree search (STS) tree traversal. The algorithm assumes the same constellation and bit table on all of the transmit antennas. Given as inputs, the received symbol vector and the estimated channel matrix, the algorithm outputs the loglikelihood ratios (LLRs) of the transmitted bits.
The algorithm assumes a MIMO system model with N_{T} transmit antennas and N_{R} receive antennas where N_{T} symbols are simultaneously sent, expressed as:
y = Hs + n.
where y is the received symbol vector, H is the MIMO channel matrix, s is the transmitted symbol vector, and n is the thermal noise.
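The model above can be simulated directly for one channel use. This sketch assumes a 2x2 system with unit-power QPSK symbols and Rayleigh fading:

```matlab
% Sketch of the model y = H*s + n for a single channel use (nT = nR = 2).
s = [1+1i; -1-1i]/sqrt(2);                         % transmitted QPSK symbols
H = (randn(2,2) + 1i*randn(2,2))/sqrt(2);          % Rayleigh-fading channel
sigma = 0.1;                                       % assumed noise std. dev.
n = sigma*(randn(2,1) + 1i*randn(2,1))/sqrt(2);    % complex AWGN
y = H*s + n;                                       % received symbol vector
```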
The MIMO detector seeks the maximum-likelihood (ML) solution, $${\widehat{s}}_{ML}$$, such that:
$${\widehat{s}}_{ML}=\underset{s\in O}{\mathrm{arg}\,\mathrm{min}}\,{\Vert y-Hs\Vert}^{2}$$
where O is the complex-valued constellation from which the N_{T} elements of s are chosen.
Soft detection also computes a log-likelihood ratio (LLR) for each bit, which serves as a measure of the reliability of the estimate for that bit. The LLR is calculated using the max-log approximation:
$$L\left({x}_{j,b}\right)=\underset{{\lambda}^{ML}}{\underbrace{\underset{s\in {x}_{j,b}^{(0)}}{\mathrm{min}}{\Vert y-Hs\Vert}^{2}}}-\underset{{\lambda}_{j,b}^{\overline{ML}}}{\underbrace{\underset{s\in {x}_{j,b}^{(1)}}{\mathrm{min}}{\Vert y-Hs\Vert}^{2}}}$$
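For a small system, the max-log LLRs can be computed by exhaustive search, which makes the two minimizations above concrete. This sketch assumes a 2x2 QPSK setup with an assumed Gray bit table; it relies on MATLAB implicit expansion (R2016b+):

```matlab
% Exhaustive max-log LLR computation for a 2x2 QPSK system (illustration
% only; the sphere decoder avoids this exhaustive search).
O = (1/sqrt(2))*[1+1i; -1+1i; -1-1i; 1-1i];    % constellation (assumed)
bitTable = [0 0; 0 1; 1 1; 1 0];               % bit label per point (assumed)
H = (randn(2,2) + 1i*randn(2,2))/sqrt(2);
y = H*O([1;3]) + 0.05*(randn(2,1) + 1i*randn(2,1));   % received vector

[i1,i2] = ndgrid(1:4);                         % all 16 candidate index pairs
S = [O(i1(:)).'; O(i2(:)).'];                  % 2-by-16 candidate vectors
d = sum(abs(y - H*S).^2, 1);                   % ||y - H*s||^2 per candidate
labels = [bitTable(i1(:),:), bitTable(i2(:),:)];   % 16-by-4 stacked bit labels
llr = zeros(1,4);
for b = 1:4                                    % one LLR per transmitted bit
    llr(b) = min(d(labels(:,b)==0)) - min(d(labels(:,b)==1));
end
```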
where
L(x_{j,b}) is the LLR estimate for each bit.
$${x}_{j,b}$$ is each sent bit, the bth bit of the jth symbol.
$${x}_{j,b}^{(0)}$$ and $${x}_{j,b}^{(1)}$$ are the disjoint sets of vector symbols that have the bth bit in the label of the jth scalar symbol equal to 0 and 1, respectively. The two λ symbols denote distances calculated as squared norms, specifically:
$${\lambda}^{ML}$$ is the distance to the maximum-likelihood solution $${\widehat{s}}_{ML}$$.
$${\lambda}_{j,b}^{\overline{ML}}$$ is the distance to the counter-hypothesis, which denotes the binary complement of the bth bit in the binary label of the jth entry of $${\widehat{s}}_{ML}$$. Specifically, it is the minimum over the symbol set $${x}_{j,b}^{\left(\overline{{x}_{j,b}^{ML}}\right)}$$, which contains all of the possible vectors for which the bth bit of the jth entry is flipped compared to the same entry of $${\widehat{s}}_{ML}$$.
Based on whether $${x}_{j,b}^{ML}$$ is 0 or 1, the LLR estimate for bit $${x}_{j,b}$$ is computed as follows:
$$L({x}_{j,b})=\left\{\begin{array}{cc}{\lambda}^{ML}-{\lambda}_{j,b}^{\overline{ML}},& {x}_{j,b}^{ML}=0\\ {\lambda}_{j,b}^{\overline{ML}}-{\lambda}^{ML},& {x}_{j,b}^{ML}=1\end{array}\right.$$
The design of a decoder strives to efficiently find $${\widehat{s}}_{ML}$$, $${\lambda}^{ML}$$, and $${\lambda}_{j,b}^{\overline{ML}}$$.
This search can be converted into a tree search by means of the sphere decoding algorithm. To this end, the channel matrix is decomposed into $$H=QR$$ by means of a QR decomposition. Left-multiplying y by Q^{H}, the problem can be reformulated as:
$$\begin{array}{l}{\lambda}^{ML}=\underset{s\in O}{\mathrm{min}}\,{\Vert \overline{y}-Rs\Vert}^{2}\\ {\lambda}_{j,b}^{\overline{ML}}=\underset{s\in {x}_{j,b}^{\left(\overline{{x}_{j,b}^{ML}}\right)}}{\mathrm{min}}\,{\Vert \overline{y}-Rs\Vert}^{2}\end{array}$$
Using this reformulated problem statement, the triangular structure of R can be exploited to arrange a tree structure such that each of the leaf nodes corresponds to a possible s vector, and the partial distances to the nodes in the tree can be calculated cumulatively by adding to the partial distance of the parent node.
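The reformulation is distance-preserving because Q is unitary, which a quick numerical check makes evident. This sketch uses an arbitrary 2x2 channel and candidate vector:

```matlab
% Check that the QR reformulation preserves distances: for H = Q*R and
% ybar = Q'*y, ||y - H*s||^2 equals ||ybar - R*s||^2 up to rounding.
H = (randn(2,2) + 1i*randn(2,2))/sqrt(2);       % arbitrary channel
s = [1+1i; -1+1i]/sqrt(2);                      % arbitrary candidate vector
y = H*s + 0.1*(randn(2,1) + 1i*randn(2,1));     % received vector
[Q,R] = qr(H);                                  % Q unitary, R upper triangular
ybar = Q'*y;                                    % Q' is the conjugate transpose Q^H
d1 = norm(y - H*s)^2;
d2 = norm(ybar - R*s)^2;
% d1 and d2 agree to numerical precision because Q is unitary.
```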
In the STS algorithm, the $${\lambda}^{ML}$$ and $${\lambda}_{j,b}^{\overline{ML}}$$ metrics are searched concurrently. The goal is to have a list containing the metric $${\lambda}^{ML}$$, along with the corresponding bit sequence $${x}^{ML}$$ and the metrics $${\lambda}_{j,b}^{\overline{ML}}$$ of all counter-hypotheses. The subtree originating from a given node is searched only if the result can lead to an update of either $${\lambda}^{ML}$$ or $${\lambda}_{j,b}^{\overline{ML}}$$.
The STS algorithm flow can be summarized as:
If, when reaching a leaf node, a new ML hypothesis is found $$\left(d\left(x\right)<{\lambda}^{ML}\right)$$, all $${\lambda}_{j,b}^{\overline{ML}}$$ for which $${x}_{j,b}={x}_{j,b}^{\overline{ML}}$$ are set to $${\lambda}^{ML}$$, which now turns into a valid counter-hypothesis. Then $${\lambda}^{ML}$$ is set to the current distance, d(x).
If $$d(x)\ge {\lambda}^{ML}$$, only the counter-hypotheses have to be checked. For all j and b for which $$d\left(x\right)<{\lambda}_{j,b}^{\overline{ML}}$$ and $${x}_{j,b}={x}_{j,b}^{\overline{ML}}$$, the decoder updates $${\lambda}_{j,b}^{\overline{ML}}$$ to be d(x).
A subtree is pruned if the partial distance of its root node is larger than the current $${\lambda}_{j,b}^{\overline{ML}}$$ values that may be affected by traversing the subtree.
The STS concludes once all of the tree nodes have been visited once or pruned.
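The tree traversal and pruning described above can be illustrated with a stripped-down depth-first search that tracks only the ML metric (no counter-hypothesis list), for a 2x2 QPSK system. The nesting of the two loops mirrors the two levels of the tree induced by the triangular R:

```matlab
% Simplified depth-first sphere search for the ML solution only
% (illustration of the tree-traversal idea, not the full STS algorithm).
O = (1/sqrt(2))*[1+1i; -1+1i; -1-1i; 1-1i];    % QPSK constellation (assumed)
H = (randn(2,2) + 1i*randn(2,2))/sqrt(2);
y = H*O([2;4]) + 0.05*(randn(2,1) + 1i*randn(2,1));
[Q,R] = qr(H);  ybar = Q'*y;                   % triangularize the problem

best = inf;  sBest = zeros(2,1);               % current radius and hypothesis
for k2 = 1:4                                   % tree level 2 (bottom row of R)
    d2 = abs(ybar(2) - R(2,2)*O(k2))^2;        % partial distance at level 2
    if d2 >= best, continue, end               % prune the whole subtree
    for k1 = 1:4                               % tree level 1 (leaf nodes)
        d1 = d2 + abs(ybar(1) - R(1,1)*O(k1) - R(1,2)*O(k2))^2;
        if d1 < best                           % new ML hypothesis found
            best = d1;  sBest = [O(k1); O(k2)];
        end
    end
end
```

In the full STS decoder, the pruning test compares against the counter-hypothesis metrics as well, so that the LLRs of all bits remain exact under the max-log approximation.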
Limitations
The output LLR values are not scaled by the noise variance. For coded links employing iterative coding (LDPC or turbo) or MIMO-OFDM with Viterbi decoding, scale the output LLR values by the channel state information to achieve better performance.
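Because the max-log metric differences scale as 1/σ², a common convention is to divide the raw LLRs by an estimated noise variance before channel decoding. A minimal sketch, assuming the noise variance is available from an estimator and `rawLLR` stands in for the sphere decoder's soft output:

```matlab
% Sketch: scale raw sphere-decoder LLRs before an LDPC/turbo decoder.
noiseVar = 0.1;                     % assumed estimated noise variance
rawLLR = randn(200, 2);             % placeholder for the decoder's soft output
scaledLLR = rawLLR / noiseVar;      % max-log LLRs scale as 1/sigma^2
```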
Selected Bibliography
[1] Studer, C., A. Burg, and H. Bölcskei. "Soft-Output Sphere Decoding: Algorithms and VLSI Implementation." IEEE Journal on Selected Areas in Communications. Vol. 26, No. 2, February 2008, pp. 290–300.
[2] Cho, Y. S., et al. MIMO-OFDM Wireless Communications with MATLAB. IEEE Press, 2011.
[3] Hochwald, B. M., and S. ten Brink. "Achieving Near-Capacity on a Multiple-Antenna Channel." IEEE Transactions on Communications. Vol. 51, No. 3, March 2003, pp. 389–399.
[4] Agrell, E., T. Eriksson, A. Vardy, and K. Zeger. "Closest Point Search in Lattices." IEEE Transactions on Information Theory. Vol. 48, No. 8, August 2002, pp. 2201–2214.
Extended Capabilities
Version History
Introduced in R2013a