*Kenric P. Nelson*

- Published in print:
- 2020
- Published Online:
- December 2020
- ISBN:
- 9780190636685
- eISBN:
- 9780190636722
- Item type:
- chapter

- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780190636685.003.0012
- Subject:
- Economics and Finance, Microeconomics


This chapter introduces a simple, intuitive approach to the assessment of probabilistic inferences. The Shannon information metrics are translated to the probability domain. The translation shows that the negative logarithmic score and the geometric mean are equivalent measures of the accuracy of a probabilistic inference. The geometric mean of forecasted probabilities is thus a measure of forecast accuracy and represents the central tendency of the forecasts. The reciprocal of the geometric mean is referred to as the perplexity and defines the number of independent choices needed to resolve the uncertainty. The assessment method introduced in this chapter is intended to reduce the ‘qualitative’ perplexity relative to the potpourri of scoring rules currently used to evaluate machine learning and other probabilistic algorithms. Utilization of this assessment will provide insight into designing algorithms with reduced ‘quantitative’ perplexity and thus improved accuracy of probabilistic forecasts. The translation of information metrics to the probability domain incorporates the generalized entropy functions developed by Rényi and Tsallis. Both generalizations translate to the weighted generalized mean. The generalized mean of probabilistic forecasts forms a spectrum of performance metrics referred to as a Risk Profile. The arithmetic mean is used to measure the decisiveness, while the –2/3 mean is used to measure the robustness.

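The relationships described in this abstract can be sketched numerically. The snippet below is illustrative, not code from the chapter: the forecast probabilities are made-up values, and `generalized_mean` is a hypothetical helper implementing the standard power mean. It shows the equivalence of the geometric mean and the negative logarithmic score, the perplexity as the geometric mean's reciprocal, and the arithmetic and –2/3 means as the decisiveness and robustness ends of a Risk Profile.

```python
import math

def generalized_mean(probs, r):
    """Power (generalized) mean of forecast probabilities.

    r -> 0 gives the geometric mean; r = 1 the arithmetic mean;
    negative r weights poor forecasts more heavily.
    """
    n = len(probs)
    if r == 0:
        return math.exp(sum(math.log(p) for p in probs) / n)
    return (sum(p ** r for p in probs) / n) ** (1 / r)

# Illustrative probabilities assigned to the events that actually occurred.
forecasts = [0.9, 0.6, 0.75, 0.8]

geo = generalized_mean(forecasts, 0)        # accuracy / central tendency
perplexity = 1 / geo                        # independent choices to resolve uncertainty
neg_log_score = -sum(math.log(p) for p in forecasts) / len(forecasts)

# Equivalence claimed in the abstract: geometric mean == exp(-negative log score).
assert abs(geo - math.exp(-neg_log_score)) < 1e-12

decisiveness = generalized_mean(forecasts, 1)       # arithmetic mean
robustness = generalized_mean(forecasts, -2 / 3)    # -2/3 mean, sensitive to bad forecasts
```

For any forecast set that is not perfectly uniform, the robustness end of the profile sits below the geometric mean, which sits below the decisiveness end, reflecting the power-mean inequality.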

*Richard Pettigrew*

- Published in print:
- 2016
- Published Online:
- May 2016
- ISBN:
- 9780198732716
- eISBN:
- 9780191797019
- Item type:
- book

- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198732716.001.0001
- Subject:
- Philosophy, Metaphysics/Epistemology, Philosophy of Science


This book explores a particular way of justifying the rational principles that govern credences (or degrees of belief). The main principles that the book justifies are the central tenets of Bayesian epistemology, though many other related principles are met along the way. These are: Probabilism, the claim that credences should obey the laws of probability; the Principal Principle, which says how credences in hypotheses about the objective chances should relate to credences in other propositions; the Principle of Indifference, which says that, in the absence of evidence, credences should be distributed equally over all possibilities that are entertained; and Conditionalization, the Bayesian account of how an agent should plan to respond when new evidence is received. Ultimately, then, the book is a study in the foundations of Bayesianism. To justify these principles, the book looks to decision theory. An agent’s credences are treated as if they were a choice she makes, and their value is measured by their epistemic utility. The book appeals to the principles of decision theory to show that, when epistemic utility is measured in this way, the credences that violate the principles listed above are ruled out as irrational. The account of epistemic utility given is the veritist’s: the sole fundamental source of epistemic utility for credences is their accuracy. Thus, this is an investigation of the version of *epistemic utility theory* known as *accuracy-first epistemology*.


*Richard Pettigrew*

- Published in print:
- 2016
- Published Online:
- May 2016
- ISBN:
- 9780198732716
- eISBN:
- 9780191797019
- Item type:
- chapter

- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198732716.003.0004
- Subject:
- Philosophy, Metaphysics/Epistemology, Philosophy of Science


This chapter surveys the existing attempts to characterize the legitimate inaccuracy measures. It considers characterizations due to James M. Joyce as well as the characterization offered by Hannes Leitgeb and Richard Pettigrew. In each case, some of the assumptions required for the characterization are seen to be unjustified.


*Richard Pettigrew*

- Published in print:
- 2016
- Published Online:
- May 2016
- ISBN:
- 9780198732716
- eISBN:
- 9780191797019
- Item type:
- chapter

- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198732716.003.0005
- Subject:
- Philosophy, Metaphysics/Epistemology, Philosophy of Science


This chapter presents the preferred characterization of the legitimate inaccuracy measures. It begins by characterizing all of the strictly proper inaccuracy measures. Later, a further condition is considered that narrows the field to a single inaccuracy measure, namely, the popular Brier score.

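The Brier score and strict propriety mentioned in this abstract can be illustrated with a short sketch. The code below is not from the book: the credence values are made-up, and `brier` and `expected_brier` are hypothetical helpers implementing the standard definitions. Strict propriety means that, by the lights of any credence function, reporting that very credence function uniquely minimizes expected inaccuracy.

```python
def brier(credences, truth_index):
    """Brier score: squared Euclidean distance between the credence vector
    and the omniscient credences (1 at the true cell, 0 elsewhere).
    Lower scores mean greater accuracy."""
    return sum((c - (1.0 if i == truth_index else 0.0)) ** 2
               for i, c in enumerate(credences))

def expected_brier(report, true_credence):
    """Expected inaccuracy of reporting `report`, evaluated by `true_credence`."""
    return sum(p * brier(report, i) for i, p in enumerate(true_credence))

# Hypothetical credences over a three-cell partition.
p = [0.7, 0.2, 0.1]
honest = expected_brier(p, p)

# Strict propriety: every alternative report does strictly worse in expectation.
for q in ([0.6, 0.3, 0.1], [1.0, 0.0, 0.0], [1/3, 1/3, 1/3]):
    assert expected_brier(q, p) > honest
```

The gap is exactly the squared distance between `q` and `p`, which is why the Brier score is not merely proper but strictly proper: only the honest report attains the minimum.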