Glossary






Glossaries and dictionaries of the Flarus translation bureau


Karelian

Language (script) codes


    Krl, English
      See: knowledge representation language.

    Kullback-Leibler information measure
      The Kullback-Leibler information measure provides a single-number summary for comparing two distributions or models. The distance between a true distribution (or model) and some other distribution (or model) is defined as the average difference between the log densities of the true and the other distribution, averaged over the true distribution. It can be derived as an information-theoretic criterion and is often used for comparing candidate models for data. Since the Kullback-Leibler distance requires knowledge of the true distribution, related measures, such as the Akaike information criterion (AIC), are often used instead. See also: Akaike information criterion, entropy.
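The definition above (the average difference between log densities, averaged over the true distribution) can be sketched for discrete distributions as follows. This is a minimal illustration, not part of the glossary; the example distributions are hypothetical.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) for discrete distributions.

    Averages log(p_i / q_i) over the true distribution P, as in the
    glossary definition. Terms with p_i == 0 contribute nothing.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative values: a fair coin as the "true" distribution,
# a biased coin as the candidate model.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # positive: the model differs from the truth
print(kl_divergence(p, p))  # 0.0: a distribution is at zero distance from itself
```

Note that the measure is asymmetric: D(P || Q) generally differs from D(Q || P), which is one reason symmetric or penalized alternatives such as AIC are often preferred for model comparison.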


    Карельская автономная советская социалистическая республика, karelia, Russian

    Карельский перешеек (between the Gulf of Finland and Lake Ladoga, USSR), Russian



    Kum, English
