
Glossaries and dictionaries of the Flarus translation agency


Law of large numbers

Glossary of Statistical Terms
  1. The law of large numbers says that in repeated, independent trials with the same probability p of success in each trial, the percentage of successes is increasingly likely to be close to p as the number of trials increases. More precisely, for every fixed positive amount e > 0, the chance that the percentage of successes differs from the probability p by more than e converges to zero as the number of trials n goes to infinity. Note that, in contrast to the difference between the percentage of successes and the probability of success, the difference between the number of successes and the expected number of successes, n×p, tends to grow as n grows. (A small simulation sketch after this entry displays both quantities: the difference between the number of successes and n×p, and the difference between the percentage of successes and p.)

  2. The law of large numbers (the regularities it establishes are expressed the more fully and accurately, the larger the body of mass phenomena and facts covered; as the number of observed homogeneous facts decreases, the discrepancies between the statistical law and the empirically observed phenomena increase).
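
A minimal simulation sketch of the contrast described in definition 1, assuming fair-coin trials (p = 0.5); the sample sizes and the use of Python's random module are illustrative choices, not part of the glossary source.

```python
import random

def simulate(n, p=0.5):
    """Run n independent Bernoulli(p) trials and return how far the
    results deviate from their expected values."""
    successes = sum(random.random() < p for _ in range(n))
    count_gap = abs(successes - n * p)          # tends to grow with n
    percent_gap = abs(successes / n - p) * 100  # tends to shrink with n
    return count_gap, percent_gap

random.seed(0)
for n in (100, 10_000, 1_000_000):
    count_gap, percent_gap = simulate(n)
    print(f"n={n:>9}: |successes - n*p| = {count_gap:9.1f}, "
          f"|% successes - p| = {percent_gap:.3f}%")
```

For most runs the percentage gap shrinks toward zero as n grows while the absolute count gap grows, which is exactly the contrast drawn in the definition.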


Закон больших чисел (law of large numbers), Russian
    The law formulated by Jacob Bernoulli (1654–1705), which states that the accuracy of a sample mean increases (or the standard error of a statistic decreases) as the number of units in the sample grows. The larger the sample, the more confidently it can be regarded as representative of the "universe" (the general population). The law holds only for unbiased samples.




Independent, English
  1. One who logs and sells his output on the open market; not associated with a mill or under company or dealer contract (19).

  2. Independent; autonomous

  3. Independent, autonomous

  4. Independent, self-sufficient; a rentier (a person living on income from capital)

  5. Independent

  6. (adj.) independent

  7. A merchant ship under naval control, but sailing alone and unescorted by any warship.


Probability, English
  1. Probability can be generally defined as a measure of how likely some event is to occur. The event could be an explosion, a lottery win, or perhaps cancer induction. Mathematically speaking, the value of probability varies between 0 and 1, where 0 means an event will not occur and 1 means it is certain to occur.

  2. Probability; possibility.

  3. The probability of an event is a number between zero and 100%. The meaning (interpretation) of probability is the subject of theories of probability, which differ in their interpretations. However, any rule for assigning probabilities to events has to satisfy the axioms of probability.

  4. Probability

  5. Probability; probability sample, syn. random sample

  6. Probability. A measure of the chance that a particular event occurs. For example, the probability of randomly selecting from a population a person with a particular attribute equals the proportion of people in the population who have that attribute.

  7. Probability; possibility

  8. Probability; exceedance probability (of a hydrological quantity); ~ of no-failure: probability of failure-free operation

  9. Probability is a method for representing uncertainty about propositions or events. It represents the uncertainty about a proposition on a scale from 0 to 1, with 0 representing complete certainty that the proposition is false or that an event will not occur, and 1 representing the opposite. Formally, a probability measure is one that follows Kolmogorov's axioms. There are two main schools of thought on the meaning of probability. Frequentists take a narrow interpretation of probability, allowing only hypothetically repeatable events or experiments to be quantified by probability, while Bayesians take a broader interpretation that allows reasoning about "one-shot" events and propositions based on the current knowledge about nature. The Bayesian interpretation is most commonly used in artificial intelligence, while the frequentist interpretation is most commonly taught in statistics courses. The label "Bayesian" arises from the central role that Bayes' theorem plays in this use of probability. It allows one to reason from effects to causes and encourages the use of probability measures to describe supposedly fixed events or propositions, which frequentists disallow. The probability of such an event reflects one's state of knowledge about it, rather than being an assertion that the unknown event can vary. For example, a Bayesian would have no qualms about stating the probability that a given die, rolled and hidden from sight, shows, say, a six. A frequentist would be unable to make such a statement, preferring to talk about confidence in the method when applied to a hypothetically large number of repeated experiments. In the end, they would act in similar ways: when long-run data are available, Bayesians and frequentists end up with the same estimates. See also: Bayes' theorem, Kolmogorov's axioms. (A small numeric sketch of Bayes' theorem, using the die example, follows this entry.)

  10. Probability. A mathematical measure of the possibility that some event will occur, expressed as a fraction or a percentage [30]. Values of statistical probability range from 1, or 100 percent (always), to 0, or 0 percent (never) [20]. The closest approximation to the true probability is given by the relative frequency of the event obtained from a large series of measurements or results [33]. Probability may also be defined as the expression, in some indefinable form, of a "degree of belief", or as the limiting frequency of an event in an infinite random sequence [49].

  11. The likelihood of something happening, for example a sale being made.

  12. The relative likelihood of a particular outcome among all possible outcomes.

  13. Likelihood that an event may occur, expressed as a number between 0 and 1.
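
As a companion to definition 9, here is a minimal numeric sketch of Bayes' theorem applied to the hidden-die example mentioned there; the particular piece of evidence ("the face is even") and the resulting numbers are illustrative assumptions, not part of the glossary source.

```python
from fractions import Fraction

def bayes_posterior(prior, likelihood, evidence):
    """Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)."""
    return likelihood * prior / evidence

# Hypothetical scenario: a fair die is rolled and hidden from sight.
prior_six = Fraction(1, 6)      # prior belief that the hidden face is a six
p_even_given_six = Fraction(1)  # a six is certainly even
p_even = Fraction(3, 6)         # three of the six faces are even

# A truthful observer reports only that the hidden face is even.
posterior_six = bayes_posterior(prior_six, p_even_given_six, p_even)
print(posterior_six)            # 1/3
```

The prior 1/6 encodes the state of knowledge before any report; conditioning on the report updates it to 1/3. This is the kind of statement about a single, already-determined outcome that the Bayesian interpretation permits and the frequentist interpretation declines to make.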


Percentage, English
  1. Percentage (French: pourcentage)

  2. Percent

  3. The proportion or rate per hundred: what is the percentage of long-stay patients in the hospital?

  4. Percent; percentage content; percentage composition; percentage ratio; percentage-of-completion method

  5. Percentage value; the value of a quantity expressed as a percentage; percentage content; ~ by

  6. Commission (expressed as a percentage)


Difference, English
  1. Disagreement, inequity, contrast, dissimilarity, incompatibility

  2. Difference; difference (the result of subtraction)

  3. An important army term, meaning firstly the sum to be paid by officers when exchanging from the half to full pay; and, secondly, the price or difference in value of the several commissions.


Displaying, English
    Displaying; output to the screen; showing


Marginal probability distribution, English
    The marginal probability distribution of a random variable that has a joint probability distribution with some other random variables is the probability distribution of that random variable without regard for the values that the other random variables take. The marginal distribution of a discrete random variable X1 that has a joint distribution with other discrete random variables can be found from the joint distribution by summing over all possible values of the other variables. For example, suppose we roll two fair dice independently. Let X1 be the number of spots that show on the first die, and let X2 be the total number of spots that show on both dice. Then the joint distribution of X1 and X2 assigns probability 1/36 to each pair (x1, x2) with x1 = 1, …, 6 and x2 − x1 = 1, …, 6, and probability zero to every other pair.
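
A minimal sketch of the marginalization described above, using the two-dice example from the entry (X1 = first die, X2 = total of both dice); representing probabilities with Python's fractions module is an illustrative choice.

```python
from fractions import Fraction
from collections import defaultdict

# Joint distribution of X1 (spots on the first die) and X2 (total spots
# on both dice) for two independent fair dice: each ordered pair of
# faces has probability 1/36.
joint = defaultdict(Fraction)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        joint[(d1, d1 + d2)] += Fraction(1, 36)

# Marginal distribution of X1: sum the joint probabilities over all
# values of X2, without regard for what X2 turned out to be.
marginal_x1 = defaultdict(Fraction)
for (x1, _x2), p in joint.items():
    marginal_x1[x1] += p

print(dict(marginal_x1))  # each face of the first die gets probability 1/6
```

Summing the joint distribution over the "other" variable is exactly the operation the definition describes; here it recovers the uniform distribution of a single fair die.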


Law of averages, English
    The law of averages says that the average of independent observations of random variables that have the same probability distribution is increasingly likely to be close to the expected value of the random variables as the number of observations grows. More precisely, if X1, X2, X3, … are independent random variables with the same probability distribution, and E(X) is their common expected value, then for every number ε > 0 the chance that the average of the first n observations differs from E(X) by more than ε converges to zero as n goes to infinity: P(|(X1 + X2 + … + Xn)/n − E(X)| > ε) → 0 as n → ∞.
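
A minimal simulation sketch of this convergence, assuming rolls of a fair six-sided die (common expected value E(X) = 3.5); the sample sizes and the use of Python's random module are illustrative choices, not part of the glossary source.

```python
import random

random.seed(1)
expected = 3.5  # E(X) for a fair six-sided die

for n in (100, 10_000, 1_000_000):
    rolls = (random.randint(1, 6) for _ in range(n))
    average = sum(rolls) / n
    # The gap |average - E(X)| typically shrinks as n grows.
    print(f"n={n:>9}: average = {average:.4f}, gap = {abs(average - expected):.4f}")
```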