86 Entropy

Named Shannon Entropy in TradingView. Entropy measures the unpredictability of the data: a fair die (p = 1/6 per outcome) has higher entropy than a fair coin (p = 1/2). It is a measure of "surprise" in the data; the larger the move or deviation from the most probable value, the greater the information gain.
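For example, using the general formula H = -SUM(p_i * log2(p_i)): a fair coin gives H = log2(2) = 1 bit, while a fair die gives H = log2(6) ≈ 2.585 bits, so each roll of the die carries more new information than each coin flip.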

It is measured as:

P = close / SUM(close, length)

E = SUM(-P * npLog(P) / npLog(base), length)
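The same calculation can be written as a short Python sketch, assuming a pandas Series of closes; the function name and default parameters below are illustrative, not a library API:

import numpy as np
import pandas as pd

def entropy(close: pd.Series, length: int = 10, base: float = 2.0) -> pd.Series:
    # P = close / SUM(close, length): each close as a share of the rolling total
    p = close / close.rolling(length).sum()
    # E = SUM(-P * log(P) / log(base), length): rolling Shannon entropy in the chosen base
    return (-p * np.log(p) / np.log(base)).rolling(length).sum()

# Example: 10-bar entropy of a synthetic price series
prices = pd.Series(100 + np.random.default_rng(0).standard_normal(250).cumsum())
print(entropy(prices, length=10, base=2).tail())

Because P is a rolling proportion rather than a true probability distribution, the result is a relative gauge of how evenly recent closes are spread, not entropy in the strict information-theoretic sense.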

The TradingView function also allows volume to be included.
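How volume enters that calculation is not shown here; one plausible reading, offered only as an assumption, is to build the weights from volume instead of close:

P = volume / SUM(volume, length)

E = SUM(-P * npLog(P) / npLog(base), length)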

Figure 10.1: Shannon Entropy