
Entropy Information

Alex.Y
Science

2025-06-06 15:09:00


Entropy: the natural law by which systems evolve from order to disorder (the principle of entropy increase).

  • Information entropy (Shannon entropy): a measure of the uncertainty of information
  • Thermodynamic entropy (Clausius entropy): a measure of a system's degree of disorder
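
Both definitions can be evaluated numerically. A minimal sketch (the numbers are illustrative, not from the post; L_f ≈ 334 J/g is the latent heat of fusion of water):

```python
import math

# Clausius entropy change for a reversible process at constant T: dS = dQ / T
m = 100.0        # grams of ice melted (illustrative)
L_f = 334.0      # latent heat of fusion of water, J/g
T = 273.15       # melting point of ice, K
delta_S = m * L_f / T
print(f"Melting {m:.0f} g of ice: ΔS ≈ {delta_S:.1f} J/K")

# Shannon entropy of a fair coin: H = -Σ p·log2(p) = 1 bit
p = [0.5, 0.5]
H = -sum(pi * math.log2(pi) for pi in p)
print(f"Fair coin: H = {H:.1f} bit")
```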

Classic Examples of Entropy Increase

1. Ink Dropped into Clear Water

```python
import matplotlib.pyplot as plt
import numpy as np
from matplotlib.animation import FuncAnimation
from matplotlib import cm

# Simulate the entropy increase of ink diffusing in clear water
fig, ax = plt.subplots()
x = np.linspace(0, 10, 100)
y = np.linspace(0, 10, 100)
X, Y = np.meshgrid(x, y)

# Initial state: ordered (ink concentrated at the center)
Z = np.exp(-((X - 5)**2 + (Y - 5)**2))

# Diffusion process (entropy increases)
def animate(frame):
    ax.clear()
    ax.set_title(f"Entropy increase: time step = {frame}")
    Z = np.exp(-((X - 5)**2 + (Y - 5)**2) / (1 + frame / 10))  # widening Gaussian (diffusion)
    ax.contourf(X, Y, Z, cmap=cm.Blues)
    ax.set_xlabel("Entropy up → disorder up")
    return ax

ani = FuncAnimation(fig, animate, frames=100, interval=100)
plt.show()
```

Physical process:

  • t=0: the ink molecules are highly ordered (low entropy)
  • t>0: molecules diffuse through random motion
  • t→∞: uniform distribution (maximum entropy)
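
One way to see this numerically is to treat the normalized concentration field as a probability distribution and compute its Shannon entropy at each time step (a sketch, not in the original post; it reuses the widening Gaussian of the animation above):

```python
import numpy as np

x = np.linspace(0, 10, 100)
y = np.linspace(0, 10, 100)
X, Y = np.meshgrid(x, y)

def spatial_entropy(frame):
    # Same widening Gaussian as the animation: the spread grows with time
    Z = np.exp(-((X - 5)**2 + (Y - 5)**2) / (1 + frame / 10))
    p = Z / Z.sum()                  # normalize to a probability distribution
    p = p[p > 0]
    return -np.sum(p * np.log2(p))  # Shannon entropy in bits

for frame in (0, 10, 50, 100):
    print(f"t={frame:3d}  entropy={spatial_entropy(frame):.2f} bits")
```

The printed entropy rises monotonically as the ink spreads, mirroring the physical process described above.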

2. Tidying a Room vs. Letting It Get Messy

  • Ordered state: books sorted by category, clothes neatly folded (low entropy)
  • Natural evolution: books left wherever they were used, clothes piled at random (entropy increases)
  • Final state: a messy room (high entropy)
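
The messy-room intuition can be made quantitative with Boltzmann's formula S = k·ln W, where W counts the microstates consistent with a macrostate. A toy sketch (the numbers and the book analogy are illustrative, not from the post): an "ordered" shelf of n books admits exactly one arrangement, while "any order" admits n! arrangements.

```python
import math

n = 10                       # number of books (illustrative)
W_ordered = 1                # only one arrangement matches the catalogue order
W_messy = math.factorial(n)  # any permutation counts as "messy"

# Boltzmann entropy in units of k_B: S = ln W
S_ordered = math.log(W_ordered)
S_messy = math.log(W_messy)

print(f"S(ordered) = {S_ordered:.1f} k_B,  S(messy) = {S_messy:.1f} k_B")
```

With vastly more microstates, the messy macrostate is where the room ends up by default.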

3. A Cup of Water vs. a Cup of Ice


In an isolated system, entropy is a direct measure of the probability of each energy configuration: low entropy means the energy is concentrated; high entropy means the energy is spread out.
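
That "spread out is more probable" claim can be checked with a standard toy model (an Einstein solid, an assumption of this sketch rather than part of the original post): the multiplicity Ω(N, q) = C(q + N − 1, q) counts the ways q energy quanta can be distributed over N oscillators.

```python
from math import comb

N, q = 100, 100  # oscillators per half and quanta per half (illustrative)

def omega(n_osc, quanta):
    # Multiplicity of an Einstein solid: ways to place `quanta` among `n_osc` oscillators
    return comb(quanta + n_osc - 1, quanta)

# All energy concentrated in one half vs. shared evenly between two halves
concentrated = omega(N, 2 * q) * omega(N, 0)
shared = omega(N, q) * omega(N, q)

print(f"shared / concentrated ≈ {shared / concentrated:.2e}")
```

Even for this small system, the evenly shared configuration has overwhelmingly more microstates, which is exactly the sense in which high entropy means "spread out."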

4. Information Entropy

Entropy is a measure of information: it reflects how many distinct ways the individual components of a system can be arranged given its current state.


```python
import numpy as np

# Probability distribution
probabilities = np.array([0.2, 0.6, 0.2])      # three events; entropy = 1.3710 bits
# probabilities = np.array([0.33, 0.33, 0.33]) # three events; entropy = 1.5835 bits

# Compute the Shannon entropy
def calculate_entropy(probabilities):
    return -np.sum([p * np.log2(p) for p in probabilities if p > 0])

entropy = calculate_entropy(probabilities)
print(f"The entropy of the distribution is: {entropy:.4f} bits")
```
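As the commented-out uniform distribution suggests, entropy is maximized when all outcomes are equally likely. A quick standalone check (redefining the same helper so it runs on its own):

```python
import numpy as np

def calculate_entropy(probabilities):
    # Shannon entropy in bits, skipping zero-probability events
    return -np.sum([p * np.log2(p) for p in probabilities if p > 0])

for probs in ([0.2, 0.6, 0.2], [0.1, 0.8, 0.1], [1/3, 1/3, 1/3]):
    print(probs, f"-> {calculate_entropy(np.array(probs)):.4f} bits")

# The uniform distribution attains the maximum: log2(3) ≈ 1.585 bits
```

The more skewed the distribution, the lower the entropy; the uniform case reaches log2(3), the maximum for three outcomes.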