<h1>How to Calculate KL Divergence in R (With Example)</h1>
<p><span style="color: #000000;">In statistics, the <strong>Kullback&#8211;Leibler (KL) divergence</strong> is a measure of how one probability distribution differs from a second. It is often loosely described as a distance, but it is not a true distance metric, because it is not symmetric.</span></p>
<p><span style="color: #000000;">If we have two probability distributions, P and Q, we typically write the KL divergence using the notation KL(P || Q), which means &#8220;the divergence of P from Q.&#8221;</span></p>
<p><span style="color: #000000;">We calculate it using the following formula:</span></p>
<p><span style="color: #000000;">KL(P || Q) = &Sigma; P(x) <em>ln</em>(P(x) / Q(x))</span></p>
<p><span style="color: #000000;">If the KL divergence between two distributions is zero, the two distributions are identical.</span></p>
<p><span style="color: #000000;">The simplest way to calculate the KL divergence between two probability distributions in R is to use the <strong>KL()</strong> function from the <strong>philentropy</strong> package.</span></p>
<p><span style="color: #000000;">The following example shows how to use this function in practice.</span></p>
<h2><span style="color: #000000;"><strong>Example: Calculating KL Divergence in R</strong></span></h2>
<p><span style="color: #000000;">Suppose we have the following two probability distributions in R:</span></p>
<pre style="background-color: #ececec; font-size: 15px;">#define two probability distributions
P &lt;- c(.05, .1, .2, .05, .15, .25, .08, .12)
Q &lt;- c(.3, .1, .2, .1, .1, .02, .08, .1)
</pre>
<p><span style="color: #000000;"><strong>Note</strong>: It is important that the probabilities of each distribution sum to one. The formula also requires Q(x) to be non-zero wherever P(x) is non-zero; otherwise the divergence is infinite.</span></p>
<p><span style="color: #000000;">We can use the following code to calculate the KL divergence between the two distributions:</span></p>
<pre style="background-color: #ececec; font-size: 15px;">library(philentropy)

#rbind distributions into one matrix
x &lt;- rbind(P,Q)

#calculate KL divergence (in nats)
KL(x, unit='log')

Metric: 'kullback-leibler' using unit: 'log'; comparing: 2 vectors.
kullback-leibler 
       0.5898852 
</pre>
<p><span style="color: #000000;">The KL divergence of distribution P from distribution Q is approximately <strong>0.5899</strong>.</span></p>
<p><span style="color: #000000;">The unit used in this calculation is the <a href="https://en.wikipedia.org/wiki/Nat_(unit)" target="_blank" rel="noopener">nat</a>, which is short for <em>natural unit of information</em>.</span></p>
<p><span style="color: #000000;">So we can say that the KL divergence is <strong>0.5899 nats</strong>.</span></p>
<p><span style="color: #000000;">Note also that the KL divergence is not a symmetric measure. 
In particular, if we calculate the KL divergence of distribution Q from distribution P, we will likely get a different value:</span></p>
<pre style="background-color: #ececec; font-size: 15px;">library(philentropy)

#rbind distributions into one matrix
x &lt;- rbind(Q,P)

#calculate KL divergence (in nats)
KL(x, unit='log')

Metric: 'kullback-leibler' using unit: 'log'; comparing: 2 vectors.
kullback-leibler 
       0.4975493 
</pre>
<p><span style="color: #000000;">The KL divergence of distribution Q from distribution P is approximately <strong>0.4975 nats</strong>.</span></p>
<p><span style="color: #000000;">Note also that some formulas use the base-2 logarithm to calculate the KL divergence. In that case, we speak of divergence in terms of <a href="https://en.wikipedia.org/wiki/Bit" target="_blank" rel="noopener">bits</a> rather than nats.</span></p>
<p><span style="color: #000000;">To calculate the KL divergence in bits, pass log2 to the <strong>unit</strong> argument:</span></p>
<pre style="background-color: #ececec; font-size: 15px;">library(philentropy)

#rbind distributions into one matrix
x &lt;- rbind(Q,P)

#calculate KL divergence (in bits)
KL(x, unit='log2')

Metric: 'kullback-leibler' using unit: 'log2'; comparing: 2 vectors.
kullback-leibler 
       0.7178119
</pre>
<p><span style="color: #000000;">The KL divergence of distribution Q from distribution P is approximately <strong>0.7178 bits</strong>.</span></p>
<h2><span style="color: #000000;"><strong>Additional Resources</strong></span></h2>
<p><span style="color: #000000;">The following tutorials explain how to perform other common tasks in R:</span></p>
<p><a href="https://statorials.org/id/menghasilkan-distribusi-normal-di-r/" target="_blank" rel="noopener">How to Generate a Normal Distribution in R</a><br /> <a href="https://statorials.org/id/plot-distribusi-normal-r/" target="_blank" rel="noopener">How to Plot a Normal Distribution in R</a></p>
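<p>On the question of units: a divergence in nats converts to bits by dividing by ln 2, which is also what switching the unit from 'log' to 'log2' does. A minimal base-R sketch, assuming the P and Q vectors defined earlier (variable names are illustrative):</p>

```r
#define the same two probability distributions
P <- c(.05, .1, .2, .05, .15, .25, .08, .12)
Q <- c(.3, .1, .2, .1, .1, .02, .08, .1)

#KL divergence of P from Q in nats, then converted to bits
kl_nats <- sum(P * log(P / Q))
kl_bits <- kl_nats / log(2)

#the same value comes directly from base-2 logarithms
all.equal(kl_bits, sum(P * log2(P / Q)))  #TRUE
```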