Language/Multiple-languages/Vocabulary/Information-Theory

From Polyglot Club WIKI
{{Sci-Tech-Index-Menu}}


 
==Other Lessons==
* [[Language/Multiple-languages/Vocabulary/Similar-Business-Entities|Similar Business Entities]]
* [[Language/Multiple-languages/Vocabulary/Greetings-Welcome-greeting|Greetings Welcome greeting]]
* [[Language/Multiple-languages/Vocabulary/Longest-words|Longest words]]
* [[Language/Multiple-languages/Vocabulary/Count-from-1-to-10-in-many-languages|Count from 1 to 10 in many languages]]

Latest revision as of 19:46, 27 March 2023

Hello polyglots, 😀

On this page you will find a part of the Sci–Tech Index, a project for science and technology learners.

* '''subject''': cmn-Hans-CN: 科目; cmn-Latn.Pinyin-CN: kēmù; deu-Latn-DE: Fach [n]; eng-Latn-US: subject; fra-Latn-FR: matière [f]; jpn-Jpan-JP: 科目; jpn-Hrkt-JP: か↑もく; rus-Cyrl-RU: предме́т
* '''concept''': cmn-Hans-CN: 概念; cmn-Latn.Pinyin-CN: gàiniàn; deu-Latn-DE: Begriff [m]; eng-Latn-US: concept; fra-Latn-FR: concept [m]; jpn-Jpan-JP: 概念; jpn-Hrkt-JP: が↓いねん; rus-Cyrl-RU: конце́пция
* '''prerequisite''': cmn-Hans-CN: 前提; cmn-Latn.Pinyin-CN: qiántí; deu-Latn-DE: Voraussetzung [f]; eng-Latn-US: prerequisite; fra-Latn-FR: préalable [m]; jpn-Jpan-JP: 前提; jpn-Hrkt-JP: ぜ↑んてい; rus-Cyrl-RU: предпосы́лка
{| class="wikitable"
! Concept !! Prerequisites
|-
| conditional entropy || support (topology); entropy
|-
| differential entropy, continuous entropy || probability density function (probability theory); support (topology); quantile function (probability theory)
|-
| entropy ||
|-
| information content, self-information, surprisal, Shannon information || random variable (probability theory); probability (probability theory)
|-
| joint entropy || entropy; joint probability distribution (probability theory)
|-
| limiting density of discrete points || differential entropy
|-
| mutual information || independence (probability theory); information content
|-
| relative entropy, Kullback–Leibler divergence || statistical distance (probability theory)
|}
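Several of the quantities above are tied together by simple identities: the chain rule H(X,Y) = H(X) + H(Y|X), and the fact that mutual information equals the relative entropy between a joint distribution and the product of its marginals. A minimal sketch in Python (the joint distribution used here is an illustrative assumption, not from this page):

```python
import math

def H(p):
    """Shannon entropy, in bits, of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Illustrative joint distribution p(x, y) over two binary variables.
pxy = [[0.4, 0.1],
       [0.1, 0.4]]

px = [sum(row) for row in pxy]                              # marginal p(x)
py = [sum(pxy[i][j] for i in range(2)) for j in range(2)]   # marginal p(y)

H_xy = H([p for row in pxy for p in row])   # joint entropy H(X, Y)
H_x, H_y = H(px), H(py)
H_y_given_x = H_xy - H_x                    # chain rule: H(Y|X) = H(X,Y) - H(X)
I_xy = H_x + H_y - H_xy                     # mutual information I(X; Y)

# Relative entropy D(p(x,y) || p(x)p(y)) equals the mutual information.
D = sum(pxy[i][j] * math.log2(pxy[i][j] / (px[i] * py[j]))
        for i in range(2) for j in range(2) if pxy[i][j] > 0)

assert abs(D - I_xy) < 1e-9
```

With these numbers both marginals are uniform, so H(X) = H(Y) = 1 bit and the dependence between X and Y shows up as a positive mutual information.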

[[File:Binary erasure channel.svg|156px|Binary erasure channel]]
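The figure shows a binary erasure channel: each input bit arrives intact with probability 1 − ε and is replaced by an erasure symbol with probability ε. A standard result is that its capacity is C = 1 − ε bits per use, achieved by a uniform input. A sketch of the numerical check in Python (the value ε = 0.3 is just an example):

```python
import math

def H(p):
    """Shannon entropy, in bits, of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def bec_mutual_information(eps, p1):
    """I(X; Y) for a binary erasure channel with erasure probability eps
    and input distribution P(X = 1) = p1."""
    p0 = 1 - p1
    # Output alphabet {0, erasure, 1}.
    py = [p0 * (1 - eps), eps, p1 * (1 - eps)]
    # H(Y|X) is the same for either input: the binary entropy of eps.
    return H(py) - H([eps, 1 - eps])

eps = 0.3
# The uniform input attains the capacity C = 1 - eps.
assert abs(bec_mutual_information(eps, 0.5) - (1 - eps)) < 1e-9
```

Skewing the input away from uniform (e.g. `p1 = 0.25`) gives a strictly smaller mutual information, which is what makes the uniform input capacity-achieving.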

Unmaintained! This index has moved to a Codeberg repository; use Foam with VSCodium to visualise the content.

==Other Lessons==