KL Divergence
2019. 1. 24. 22:08 · [Notes] Concept Notes by Job Role / Deep Learning
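For reference, when scipy.stats.entropy is given both pk and qk, it first normalizes each array to sum to 1 and then returns the discrete KL divergence (natural log by default):

D_{KL}(P \| Q) = \sum_i p_i \log \frac{p_i}{q_i}

The script below simply prints this value for two sample arrays.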
import numpy as np
import scipy.stats as stats

def KLD(pk, qk):
    # stats.entropy(pk, qk) normalizes both arrays to sum to 1
    # and returns sum(pk * log(pk / qk))
    kld = stats.entropy(pk, qk)
    print(kld)

# p, q: 100,000 samples each from a normal with mean 0.1, std 0.01
pk = np.random.normal(0.1, 0.01, 100000)
qk = np.random.normal(0.1, 0.01, 100000)
KLD(pk, qk)

# p, q: 100,000 samples each from the narrower normal with mean 0.1, std 0.001
pk = np.random.normal(0.1, 0.001, 100000)
qk = np.random.normal(0.1, 0.001, 100000)
KLD(pk, qk)
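Note that stats.entropy treats its two arguments as probability vectors over bins, not as raw samples, so the calls above effectively compare the two sample arrays element-wise rather than the underlying distributions. A minimal histogram-based sketch of the same idea is shown below; the helper name KLD_hist, the 100-bin count, and the 1e-10 epsilon are illustrative choices, not from the original code.

import numpy as np
import scipy.stats as stats

def KLD_hist(p_samples, q_samples, bins=100):
    # estimate both densities on a shared set of bins,
    # then take the discrete KL divergence between the histograms
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    pk, _ = np.histogram(p_samples, bins=bins, range=(lo, hi))
    qk, _ = np.histogram(q_samples, bins=bins, range=(lo, hi))
    # add a small constant so empty bins do not produce log(0)
    return stats.entropy(pk + 1e-10, qk + 1e-10)

pk = np.random.normal(0.1, 0.01, 100000)
qk = np.random.normal(0.1, 0.01, 100000)
print(KLD_hist(pk, qk))  # close to 0: both sample sets come from the same distribution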