In an AWGN channel, convolutional coding, Viterbi decoding, and soft/hard decision decoding are used to analyze the bit error rate.

1. Convolutional coding simulation (comparing hard-decision decoding BER with the theoretical value)
The generator function of the convolutional code is given as:

It can be seen that the code rate is 1/3, and the corresponding octal tap coefficients are (557, 663, 711). Write a program that applies convolutional coding, Viterbi decoding (hard decision), BPSK modulation and demodulation, and Gaussian white noise, and simulate how the bit error rate varies with the channel SNR, as shown in the following figure:
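As a quick sanity check (an illustrative Python sketch, not part of the MATLAB program), the octal tap coefficients can be expanded into their 9-bit binary connection vectors, confirming the constraint length of 9; three generator polynomials per input bit give the rate of 1/3:

```python
def octal_taps(gen_octal, k):
    """Expand octal generator polynomials into k-bit binary tap vectors."""
    return [format(int(str(g), 8), '0{}b'.format(k)) for g in gen_octal]

taps = octal_taps([557, 663, 711], 9)
print(taps)  # ['101101111', '110110011', '111001001']
```

These are exactly the binary patterns quoted in the comment of the `poly2trellis` call below.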

The red curve is the theoretical value and the blue curve is the simulated value. The simulated bit error rate is slightly higher than the theoretical one, which matches expectations, and it suggests that the decoding algorithm can be refined to bring the error rate closer to the theoretical value.
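The computation of the theoretical curve is not shown in the code excerpt below, but one common way to obtain it for hard-decision Viterbi decoding is the first-event (union-bound) approximation: model the hard-sliced channel as a BSC whose crossover probability follows from BPSK over AWGN, then evaluate the pairwise error probability at the code's free distance. The sketch below assumes this approach; the `dfree=18` default matches the script's `bf = 18`, and `b_d` is a placeholder weight coefficient (the true coefficient for this code is not given in the article):

```python
from math import comb, erfc, sqrt

def qfunc(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * erfc(x / sqrt(2))

def pairwise_error_prob(d, p):
    """Probability that a path differing in d code bits beats the correct
    path on a BSC with crossover probability p (hard-decision metric)."""
    q = 1 - p
    s = sum(comb(d, k) * p**k * q**(d - k) for k in range(d // 2 + 1, d + 1))
    if d % 2 == 0:  # ties broken at random contribute half
        s += 0.5 * comb(d, d // 2) * p**(d // 2) * q**(d // 2)
    return s

def hard_decision_ber_bound(ebn0_db, rate=1/3, dfree=18, b_d=1):
    """First-event approximation Pb ~ b_d * P2(dfree); b_d is a placeholder."""
    ebn0 = 10 ** (ebn0_db / 10)
    p = qfunc(sqrt(2 * rate * ebn0))  # BSC crossover prob. for coded BPSK/AWGN
    return b_d * pairwise_error_prob(dfree, p)
```

As expected, the bound decreases rapidly with Eb/N0, which is the shape of the red theoretical curve in the figure.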

2. Comparison of soft- and hard-decision error rates in convolutional coding
Building on the first step, under otherwise identical conditions, the soft-decision rule is used and the resulting bit error rate improvement is compared. The results are as follows:
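The "soft-decision rule" here is a 16-level (4-bit) quantization of the demodulator's soft output before Viterbi decoding, rather than slicing to 0/1 first. A Python/numpy sketch of such a quantizer, mirroring the `quantiz(-codes, ...)` call in the code section (the 0.125 threshold spacing and the 15-down-to-0 codebook are taken from that call):

```python
import numpy as np

def quantize_soft(soft):
    """4-bit quantizer matching quantiz(-codes, -0.875:0.125:0.875, 15:-1:0):
    the most negative value of -soft (a confident '1') maps to level 15,
    the most positive (a confident '0') maps to level 0."""
    edges = np.arange(-7, 8) * 0.125          # thresholds -0.875 ... 0.875
    return 15 - np.digitize(-np.asarray(soft, dtype=float), edges)
```

Passing these 16-level values (instead of hard bits) to the decoder is what preserves the reliability information that soft decision exploits.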

The red curve is the simulated BER with hard-decision decoding, and the blue curve is the simulated BER with soft-decision decoding. The figure shows that soft decision outperforms hard decision by about 1.8 dB, which is consistent with theory.
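The observed gain of about 1.8 dB is close to the classical asymptotic penalty for hard (2-level) quantization of the matched-filter output, which costs a factor of 2/π in effective SNR relative to unquantized soft decisions. A quick check of that figure:

```python
from math import log10, pi

# Asymptotic SNR penalty of hard decisions vs. ideal soft decisions
hard_decision_loss_db = 10 * log10(pi / 2)
print(round(hard_decision_loss_db, 2))  # 1.96
```

So a simulated gap of roughly 1.8 dB (slightly under the 1.96 dB asymptote, since the quantizer here keeps 16 levels of the traceback metric finite-depth) is plausible.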

3, Related code

#### Convolutional code program

close all;
clear all;

%% Initial parameter setting
EbN0 = -2:0.5:10;            % Eb/N0 range in dB
L = 200000;                  % number of information bits
data = randi([0,1],1,L);     % random information bits

A = 5;                       % coefficient of the highest-order term of the generator function
bf = 18;                     % free distance (parameters for the theoretical curve, not computed in this excerpt)

%% Channel coding
len = 9;                                     % constraint length: registers up to D^8, i.e. 9 taps
trellis = poly2trellis(len,[557 663 711]);   % octal taps (101101111, 110110011, 111001001)
encoded = convenc(data,trellis);             % rate-1/3 convolutional encoding

%% Channel, demodulation and decoding
R = 1/3;                                     % code rate
tblen = 5*len;                               % traceback depth for vitdec
Error_Bit_Hard = zeros(size(EbN0));
Error_Bit_Soft = zeros(size(EbN0));
for j = 1:length(EbN0)
    % Eb/N0 -> SNR per coded symbol, then modulate - add noise - demodulate
    snr = EbN0(j) + 10*log10(R);
    [codes_bsc,codes] = BPSK_AddNoise(encoded,snr);

    % Hard-decision decoding
    decoded1 = vitdec(codes_bsc,trellis,tblen,'trunc','hard');
    Error_Bit_Hard(j) = sum(data ~= decoded1)/L;   % BER statistics

    % Soft-decision decoding: 4-bit (16-level) quantization of the soft values
    [~,qcode] = quantiz(-codes,-0.875:0.125:0.875,15:-1:0);
    decoded2 = vitdec(qcode,trellis,tblen,'trunc','soft',4);
    Error_Bit_Soft(j) = sum(data ~= decoded2)/L;   % BER statistics
end

%% Plot the performance comparison
figure;
semilogy(EbN0,Error_Bit_Hard,'-ob','linewidth',1.5); hold on;
semilogy(EbN0,Error_Bit_Soft,'-*r','linewidth',1.5);
legend('CC(557 663 711)-Hard decision','CC(557 663 711)-Soft decision');
grid on;
xlabel('Eb/N0 (dB)'); ylabel('BER');
title('The Relationship Between BER and Eb/N0');
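For readers without the MATLAB Communications Toolbox, the same encode → hard-decision Viterbi pipeline can be sketched in plain Python. This is an illustration only: it uses a smaller rate-1/2, K=3 code (octal taps 7, 5) rather than the rate-1/3, K=9 code above, and truncated decoding (no tail flush), analogous to the `'trunc'` mode of `vitdec`:

```python
# Rate-1/2, K=3 convolutional code (octal taps 7, 5) -- a small stand-in
# for the rate-1/3, K=9 code used in the MATLAB script.
G = [0b111, 0b101]           # generator polynomials
K = 3                        # constraint length
NSTATES = 1 << (K - 1)       # 4 trellis states

def parity(x):
    return bin(x).count("1") & 1

def conv_encode(bits):
    """Shift each input bit in as MSB of the K-bit register; emit len(G) bits."""
    state, out = 0, []
    for b in bits:
        reg = (b << (K - 1)) | state
        out.extend(parity(reg & g) for g in G)
        state = reg >> 1
    return out

def viterbi_hard(rx):
    """Truncated hard-decision Viterbi: minimize Hamming metric, start state 0."""
    n = len(rx) // len(G)
    INF = 10**9
    metric = [0] + [INF] * (NSTATES - 1)
    paths = [[] for _ in range(NSTATES)]
    for t in range(n):
        r = rx[t * len(G):(t + 1) * len(G)]
        new_metric = [INF] * NSTATES
        new_paths = [None] * NSTATES
        for s in range(NSTATES):
            if metric[s] == INF:
                continue
            for b in (0, 1):
                reg = (b << (K - 1)) | s
                expected = [parity(reg & g) for g in G]
                ns = reg >> 1
                m = metric[s] + sum(e != x for e, x in zip(expected, r))
                if m < new_metric[ns]:
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    return paths[min(range(NSTATES), key=lambda s: metric[s])]
```

A round trip (`viterbi_hard(conv_encode(bits)) == bits`) recovers the input, and a single flipped channel bit in the middle of the stream is still corrected, which is the error-correction behavior the BER curves above measure at scale.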

#### Channel-plus-noise program

function [Msg_BSC,Msg]=BPSK_AddNoise(msg,snr)
% BPSK modulation: constellation mapping, bit 0 -> -1 and bit 1 -> +1
msg1 = 2*msg - 1;

% Add Gaussian white noise; sigma is the noise variance corresponding to
% the given SNR (in dB), assuming unit symbol energy
sigma = 10^(-snr/10);
noise = sqrt(sigma).*randn(1,length(msg));
Msg = msg1 + noise;            % noisy (soft) received sequence

% Demodulation: hard-slice the noisy sequence back to a 0/1 bit sequence
Msg_BSC = zeros(1,length(msg));
for i = 1:length(msg)
    if Msg(i) > 0
        Msg_BSC(i) = 1;
    end
end
end
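A numpy equivalent of this modulate → add-noise → hard-slice helper, using the same convention (SNR given in dB, unit symbol power); the function name and seeded generator are illustrative choices, not from the original program:

```python
import numpy as np

def bpsk_add_noise(bits, snr_db, rng=None):
    """BPSK-modulate bits, add AWGN at the given SNR (dB), and return
    (hard 0/1 decisions, noisy soft values)."""
    if rng is None:
        rng = np.random.default_rng(0)          # seeded for reproducibility
    symbols = 2 * np.asarray(bits) - 1          # 0 -> -1, 1 -> +1
    sigma2 = 10 ** (-snr_db / 10)               # noise variance at unit Es
    noisy = symbols + np.sqrt(sigma2) * rng.standard_normal(symbols.shape)
    return (noisy > 0).astype(int), noisy
```

The second return value (the unsliced soft samples) is what a soft-decision decoder would quantize, just as `Msg` feeds the `quantiz` call in the main program.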

If you have any questions, feel free to contact the blogger to discuss and make progress together!

Posted on Fri, 19 Jun 2020 05:21:17 -0400 by drcphd