Reading time: 10 min

12/4/2019

Signal feature extraction using Short Time Fourier Transform, Singular Value Decomposition and Maximum Entropy

Interpreting signals is often a gruelling task, let alone finding a new or fancy way to do it, but that's life in research, so no complaints there. In this post, I will demonstrate a simple approach to analysing and extracting meaningful features from multi-component signals, either to quantify change directly or to feed into a machine learning technique.

This approach comprises three steps, each of which can be replaced by a more advanced technique to extract better results, maybe…

Perform time frequency analysis on signal data

For this demonstration, I will utilise the squared magnitude of the Short Time Fourier Transform (STFT), i.e. the spectrogram, as the time-frequency representation, denoted by:

X(tau, omega) = integral x(t) w(t - tau) e^(-j*omega*t) dt,    spectrogram = |X(tau, omega)|^2    (Eq. 1)

Where x(t) is the signal under consideration, w(t) is a window function, tau is the time lag and X(tau, omega) is the Fourier transform of the function x(t)w(t - tau).

The STFT is obtained by taking the Discrete Fourier Transform (via the Fast Fourier Transform) over successive time windows. The STFT provides a linear time-frequency representation that can suffer from poor frequency resolution if the window length is too small, but it suffices for this demo. More advanced time-frequency techniques can be employed here, such as the Wigner-Ville Distribution (WVD), the Smoothed WVD and the Choi-Williams Distribution, among others.
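As a minimal sketch of this first step, the spectrogram of a two-tone test signal can be computed with SciPy (the sampling rate, tone frequencies and segment length below are illustrative choices, not values from the post):

```python
import numpy as np
from scipy.signal import spectrogram

fs = 1000.0                       # sampling rate in Hz (hypothetical)
t = np.arange(0, 1.0, 1.0 / fs)
# two-component test signal: a 50 Hz tone plus a weaker 200 Hz tone
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 200 * t)

# squared-magnitude STFT (spectrogram): Hann window, 128-sample segments
f, tau, Sxx = spectrogram(x, fs=fs, window="hann", nperseg=128)

# the dominant spectral peak should sit near the stronger 50 Hz tone
# in every time slice, since its power is 4x that of the 200 Hz tone
peak_freqs = f[np.argmax(Sxx, axis=0)]
print(peak_freqs.min(), peak_freqs.max())
```

Note the frequency resolution here is fs/nperseg ≈ 7.8 Hz, which illustrates the trade-off mentioned above: shorter windows give better time localisation at the cost of coarser frequency bins.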

Perform Singular Value Decomposition (SVD) on time frequency matrix to extract singular diagonal matrix

Singular value decomposition is a method that transforms the original correlated variables into an uncorrelated set of variables [1]. It is commonly used in signal processing to remove noise by zeroing the small singular values. In this context, SVD is utilised to reduce the dimensionality of the time-frequency matrix to a single vector of singular values, ordered from the largest to the smallest contribution.

For a given matrix S, the SVD is defined as follows:

S = U Sigma V^T

Where Sigma is a diagonal singular matrix of the same size as S, and the values are sorted in decreasing order along the main diagonal. U and V are orthonormal matrices whose columns represent the left and right singular vectors, respectively.
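This decomposition is a one-liner in NumPy. The sketch below uses a random matrix as a stand-in for the time-frequency matrix, just to show that the singular values come back sorted largest-first and that U, Sigma and V reconstruct the original:

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.standard_normal((64, 32))    # stand-in for a time-frequency matrix

# thin decomposition S = U @ diag(sigma) @ Vt
U, sigma, Vt = np.linalg.svd(S, full_matrices=False)

# singular values are returned in decreasing order along the "diagonal"
assert np.all(np.diff(sigma) <= 0)

# reconstruction check: U * Sigma * V^T recovers S
S_rec = U @ np.diag(sigma) @ Vt
print(np.allclose(S, S_rec))    # True
```

For the feature-extraction step, only the `sigma` vector is kept; the U and V factors are discarded.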

Perform entropy calculation on the singular value matrix

Entropy quantifies future uncertainty from previous probability outcomes. When a received signal matches what is likely or highly probable, the entropy value is close to zero, whereas unpredictable signals assume high entropy values, with the maximum for a binary event corresponding to a 50% likelihood.

If the average amount of information we receive per signal, as determined by Eq. 2, increases, it signifies an increase in uncertainty due to a shift in the underlying probability distribution from the previous state to the current state. In other words, the previously received signal follows a different distribution than the current received signal, indicative of a change in the medium, transmitter or receiver.

H = -sum_k p_k log(p_k)    (Eq. 2)

Where p_k = Sigma_k / Sigma_T, i.e. each singular value normalised by the sum of all singular values Sigma_T.
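The two extremes of this entropy measure are worth seeing concretely. A minimal sketch (the helper name and test values are mine, not from the post): a singular-value vector dominated by one component yields near-zero entropy, while equal singular values yield the maximum, log(N):

```python
import numpy as np

def singular_entropy(sigma):
    """Shannon entropy of normalised singular values p_k = sigma_k / sum(sigma)."""
    sigma = np.asarray(sigma, dtype=float)
    p = sigma / sigma.sum()
    p = p[p > 0]                  # guard against log(0)
    return -np.sum(p * np.log(p))

# one dominant singular value -> near-zero entropy (predictable signal)
low = singular_entropy([100.0, 1e-9, 1e-9])
# equal singular values -> maximum entropy log(N) (maximally uncertain)
high = singular_entropy([1.0, 1.0, 1.0])
print(low, high)    # high == log(3) ≈ 1.0986
```

In the feature-extraction context, a low value means the time-frequency matrix is close to rank one (a stationary, predictable signal), while a high value means its energy is spread across many uncorrelated components.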

The Matlab code to perform this algorithmic approach is as follows:

function Sentropy = SSVD(signal, window, fs)
% enter your data input checks here
nfft = 2^nextpow2(length(signal));       % FFT length: next power of two
[s, f, t, ps] = spectrogram(signal, window, [], nfft, fs, 'psd');
sv = svd(ps);                            % singular values, largest first
sumsv = sum(sv);
pk = sv./sumsv;                          % normalised singular values p_k
pk = pk(pk > 0);                         % guard against log(0)
Sentropy = -sum(pk.*log(pk));            % singular value entropy (Eq. 2)
end

It is quite straightforward to implement in Matlab or Python. This approach extracts the most significant feature variables from a time-frequency analysis, which alone yield a substantial amount of signal information. It provides a robust way to quantify signal change, using either the entropy values themselves, root mean squared errors between them, or by feeding them into machine learning techniques.
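For completeness, here is a Python sketch of the same pipeline, mirroring the MATLAB function above with SciPy and NumPy (the `window` argument is taken to be the per-segment length in samples, matching the MATLAB call; the sanity check at the end uses a hypothetical tone-versus-noise comparison):

```python
import numpy as np
from scipy.signal import spectrogram

def ssvd_entropy(signal, window, fs):
    """Spectrogram -> SVD -> entropy, mirroring the SSVD MATLAB routine."""
    nfft = int(2 ** np.ceil(np.log2(len(signal))))   # next power of two
    f, t, ps = spectrogram(signal, fs=fs, nperseg=window, nfft=nfft,
                           mode="psd")
    sv = np.linalg.svd(ps, compute_uv=False)         # singular values only
    pk = sv / sv.sum()                               # normalised p_k
    pk = pk[pk > 0]                                  # guard against log(0)
    return -np.sum(pk * np.log(pk))

# sanity check: a pure tone (near rank-one spectrogram) should score
# lower entropy than broadband noise
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
tone = np.sin(2 * np.pi * 50 * t)
noise = np.random.default_rng(1).standard_normal(t.size)
print(ssvd_entropy(tone, 128, fs) < ssvd_entropy(noise, 128, fs))
```

The entropy value itself can then be tracked over successive signal captures, with a jump in its value flagging the kind of distribution change described above.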


[1] Orovic, I., Stankovic, S., Draganic, A., "Time-frequency analysis and singular value decomposition applied to the highly multicomponent musical signals", Faculty of Electrical Engineering, University of Montenegro, Montenegro
