kok202
KLDivergence

2019. 1. 24. 22:08 · [Summary] Concept notes by field / Deep learning

import numpy as np
import scipy.stats as stats


def KLD(pk, qk):
    # scipy.stats.entropy(pk, qk) normalizes each input to sum to 1,
    # then computes KL(pk || qk) = sum(pk * log(pk / qk)).
    # Note: the raw samples below are therefore treated as one big
    # (unnormalized) probability vector, not as draws from a distribution.
    kld = stats.entropy(pk, qk)
    print(kld)


# Samples from two normal distributions with the same mean and std
pk = np.random.normal(0.1, 0.01, 100000)
qk = np.random.normal(0.1, 0.01, 100000)
KLD(pk, qk)

pk = np.random.normal(0.1, 0.001, 100000)
qk = np.random.normal(0.1, 0.001, 100000)
KLD(pk, qk)
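Since stats.entropy interprets its arguments as probability vectors, a more common way to estimate the KL divergence between two sampled distributions is to bin the samples into histograms over shared bin edges first. A minimal sketch under that approach (the bin count and the smoothing constant are arbitrary choices, and KLD_from_samples is a hypothetical helper, not part of scipy):

```python
import numpy as np
import scipy.stats as stats

def KLD_from_samples(p_samples, q_samples, bins=100):
    # Shared bin edges so the two histograms are comparable
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    p_hist, _ = np.histogram(p_samples, bins=bins, range=(lo, hi))
    q_hist, _ = np.histogram(q_samples, bins=bins, range=(lo, hi))
    # Add a tiny count to every bin to avoid log(0) / division by zero
    p_hist = p_hist + 1e-10
    q_hist = q_hist + 1e-10
    # stats.entropy normalizes the histograms and returns KL(p || q)
    return stats.entropy(p_hist, q_hist)

np.random.seed(0)  # for reproducibility
p = np.random.normal(0.0, 1.0, 100000)
q = np.random.normal(0.0, 1.0, 100000)
print(KLD_from_samples(p, q))   # close to 0: same distribution

q2 = np.random.normal(1.0, 1.0, 100000)
print(KLD_from_samples(p, q2))  # clearly positive: shifted mean
```

As a sanity check, the closed-form KL divergence between N(0, 1) and N(1, 1) is (μ1 − μ2)² / (2σ²) = 0.5, so the second estimate should land near that value.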






Source: https://m.blog.naver.com/PostView.nhn?blogId=atelierjpro&logNo=220981354861&proxyReferer=&proxyReferer=https%3A%2F%2Fwww.google.com%2F
