uncertainty

How to shade under curve in matplotlib, but with variable color alpha?

╄→尐↘猪︶ㄣ submitted on 2019-12-10 12:08:07
Question: I have some data points which are a function of one variable. I'd like to plot these, but each datum carries an associated uncertainty. Error bars would be OK, but I'd also like to visualize how we expect the error to be distributed. For instance, a Gaussian distribution with known width could be given. I'm hoping the alpha value of fill_between could be set according to the probability distribution, resulting in a plot like the one in this question about filling under a curve,
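A common workaround (not from the original question) is to approximate a continuously varying alpha by stacking many thin fill_between bands whose opacity follows the assumed Gaussian; the curve, sigma, and band count below are illustrative assumptions, a minimal sketch only.

import numpy as np
import matplotlib.pyplot as plt

# Illustrative data: a curve y(x) with an assumed Gaussian uncertainty sigma(x).
x = np.linspace(0, 10, 200)
y = np.sin(x)
sigma = 0.2 + 0.05 * x          # assumed, for demonstration only

fig, ax = plt.subplots()
ax.plot(x, y, color="C0")

# Stack thin bands out to +/- 3 sigma; each band's alpha follows the
# Gaussian density at its offset, which fakes a smoothly fading fill.
n_bands = 30
for i in range(n_bands):
    lo = 3.0 * i / n_bands
    hi = 3.0 * (i + 1) / n_bands
    alpha = 0.4 * np.exp(-0.5 * lo**2)   # density at the band's inner edge
    ax.fill_between(x, y + lo * sigma, y + hi * sigma,
                    color="C0", alpha=alpha, linewidth=0)
    ax.fill_between(x, y - hi * sigma, y - lo * sigma,
                    color="C0", alpha=alpha, linewidth=0)

plt.show()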

AI: Partial Unification in Open-World Reference Resolution

做~自己de王妃 submitted on 2019-12-10 10:59:22
Question: When performing reference resolution on predicates describing the semantics of dialogue expressions, I need to allow for partial unification because I am working in an open world. For example, consider the following scenario: there is a blue box in front of you, and we refer to this blue box using the id 3. A set of predicates box(x)^blue(x) can easily resolve to the blue box you know about; making this query will return 3. A set of predicates ball(x)^yellow(x) will not resolve to anything.
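A minimal sketch (not from the question) of resolving a conjunction of unary predicates against a store of known objects; the predicate names and the dict-based knowledge base are assumptions for illustration.

# Toy knowledge base: object id -> set of predicates known to hold for it.
KB = {
    3: {"box", "blue"},
}

def resolve(predicates):
    """Return the ids of all known objects satisfying every predicate."""
    return [oid for oid, props in KB.items() if set(predicates) <= props]

print(resolve(["box", "blue"]))     # [3]
print(resolve(["ball", "yellow"]))  # [] -- nothing known matches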

A good uncertainty (interval) arithmetic library? [closed]

谁都会走 submitted on 2019-12-09 12:02:47
Question: Given that the words "uncertain" and "uncertainty" are fairly ubiquitous, it's hard to Google "uncertainty arithmetic" and get anything immediately helpful. Thus, can anyone suggest a good library of routines, in almost any programming/scripting language, that implements handling of uncertain values, as
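One candidate often mentioned for this kind of work (for Gaussian-style error propagation rather than strict interval arithmetic; an assumption on my part, not an answer given on this page) is the Python uncertainties package. A minimal sketch with its current two-argument API:

from uncertainties import ufloat
from uncertainties import umath   # math functions aware of uncertainties

# Values with a nominal part and a standard deviation.
length = ufloat(2.00, 0.05)   # 2.00 +/- 0.05
width  = ufloat(1.50, 0.02)   # 1.50 +/- 0.02

area = length * width          # uncertainty propagated automatically
print(area)                    # e.g. 3.00+/-0.09

print(umath.sqrt(area))        # standard math functions work too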

zero division error in python uncertainties package

瘦欲@ submitted on 2019-12-07 14:29:37
Question: Why does the following zero division error occur?

>>> from uncertainties import ufloat
>>> a = ufloat((0,0))
>>> x = ufloat((0.3,0.017))
>>> a**x
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/uncertainties/__init__.py", line 601, in f_with_affine_output
    if arg.derivatives
  File "<string>", line 1, in <lambda>
ZeroDivisionError: 0.0 cannot be raised to a negative power
>>> 0.0*
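A likely cause (my reading, not confirmed on this page): to propagate the uncertainty, the package differentiates a**x with respect to a, and that derivative, x * a**(x - 1), requires raising 0.0 to a negative power when a = 0 and x < 1. The same failure can be reproduced with plain floats:

# Derivative of a**x with respect to a is x * a**(x - 1).
a, x = 0.0, 0.3
try:
    d = x * a ** (x - 1)   # 0.0 ** (-0.7) -> ZeroDivisionError
except ZeroDivisionError as exc:
    print(exc)             # "0.0 cannot be raised to a negative power"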

AI: Partial Unification in Open-World Reference Resolution

那年仲夏 submitted on 2019-12-06 06:40:19
When performing reference resolution on predicates describing the semantics of dialogue expressions, I need to allow for partial unification because I am working in an open world. For example, consider the following scenario: there is a blue box in front of you, and we refer to this blue box using the id 3. A set of predicates box(x)^blue(x) can easily resolve to the blue box you know about; making this query will return 3. A set of predicates ball(x)^yellow(x) will not resolve to anything. This is fine. But now consider ball(x)^yellow(x)^box(y)^blue(y)^behind(x,y), that is, the yellow ball
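A sketch (an assumption, not the asker's code) of the partial behaviour wanted here: ground the variables that do resolve and leave the open-world ones unbound instead of failing the whole query. Binary relations such as behind(x,y) are omitted for brevity.

# Toy knowledge base of unary facts: object id -> known predicates.
KB = {3: {"box", "blue"}}

def partially_resolve(query):
    """query: dict mapping variable -> list of unary predicates.
    Returns a binding per variable, or None for variables that cannot
    be grounded with current (open-world) knowledge."""
    bindings = {}
    for var, preds in query.items():
        matches = [oid for oid, props in KB.items() if set(preds) <= props]
        bindings[var] = matches[0] if matches else None
    return bindings

# ball(x)^yellow(x)^box(y)^blue(y): y grounds to 3, x stays open.
print(partially_resolve({"x": ["ball", "yellow"], "y": ["box", "blue"]}))
# {'x': None, 'y': 3}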

zero division error in python uncertainties package

拥有回忆 submitted on 2019-12-06 01:40:03
Why does the following zero division error occur?

>>> from uncertainties import ufloat
>>> a = ufloat((0,0))
>>> x = ufloat((0.3,0.017))
>>> a**x
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/uncertainties/__init__.py", line 601, in f_with_affine_output
    if arg.derivatives
  File "<string>", line 1, in <lambda>
ZeroDivisionError: 0.0 cannot be raised to a negative power
>>> 0.0**x
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/local/Library
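A possible workaround (assumed, not from this page; it uses the newer two-argument ufloat API rather than the tuple form in the traceback): when the base is exactly zero with zero uncertainty, a**x for positive x is simply 0, so it can be special-cased before the package tries to differentiate.

from uncertainties import ufloat

def safe_pow(a, x):
    """Power that sidesteps differentiating a**x at a == 0."""
    if a.nominal_value == 0 and a.std_dev == 0:
        return 0.0          # 0**x == 0 for any positive x
    return a ** x

a = ufloat(0.0, 0.0)
x = ufloat(0.3, 0.017)
print(safe_pow(a, x))       # 0.0 instead of ZeroDivisionError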

How to determine the uncertainty of fit parameters with Python?

懵懂的女人 submitted on 2019-12-03 17:11:14
Question: I have the following data for x and y:

x       y
1.71    0.0
1.76    5.0
1.81    10.0
1.86    15.0
1.93    20.0
2.01    25.0
2.09    30.0
2.20    35.0
2.32    40.0
2.47    45.0
2.65    50.0
2.87    55.0
3.16    60.0
3.53    65.0
4.02    70.0
4.69    75.0
5.64    80.0
7.07    85.0
9.35    90.0
13.34   95.0
21.43   100.0

For the above data, I am trying to fit the data in the form: However, there are certain uncertainties associated with x and y, where x has an uncertainty of 50% of x and y has a fixed uncertainty. I am trying to determine the uncertainty in the
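A common way to get parameter uncertainties is scipy.optimize.curve_fit with the y uncertainties passed as sigma; the square roots of the diagonal of the covariance matrix give one-standard-deviation parameter errors. This is a sketch under assumptions: the fit equation is not shown in the excerpt, so the model below is a generic placeholder, and the fixed y error of 2.0 is invented.

import numpy as np
from scipy.optimize import curve_fit

x = np.array([1.71, 1.76, 1.81, 1.86, 1.93, 2.01, 2.09, 2.20, 2.32, 2.47,
              2.65, 2.87, 3.16, 3.53, 4.02, 4.69, 5.64, 7.07, 9.35, 13.34, 21.43])
y = np.arange(0.0, 101.0, 5.0)

def model(x, a, b, c):
    # Placeholder model, NOT the form used in the original question.
    return a * np.log(x) + b * x + c

y_err = np.full_like(y, 2.0)          # assumed fixed uncertainty on y

popt, pcov = curve_fit(model, x, y, sigma=y_err, absolute_sigma=True)
perr = np.sqrt(np.diag(pcov))         # 1-sigma uncertainties of the parameters

for name, val, err in zip("abc", popt, perr):
    print(f"{name} = {val:.3f} +/- {err:.3f}")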

Plotting shaded uncertainty region in line plot in matplotlib when data has NaNs

我们两清 submitted on 2019-12-03 12:45:00
Question: I would like a plot which looks like this: I am trying to do this with matplotlib:

fig, ax = plt.subplots()
with sns.axes_style("darkgrid"):
    for i in range(5):
        ax.plot(means.ix[i][list(range(3,104))], label=means.ix[i]["label"])
        ax.fill_between(means.ix[i][list(range(3,104))]-stds.ix[i][list(range(3,104))],
                        means.ix[i][list(range(3,104))]+stds.ix[i][list(range(3,104))])
    ax.legend()

I want the shaded region to be the same colour as the line in the centre. But right now, my problem is that
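A sketch of the usual fix (with assumed stand-in data, not the asker's DataFrames): give fill_between an explicit x plus lower and upper bounds, reuse the line's colour, and drop or mask NaN positions so the band is not broken by them.

import numpy as np
import matplotlib.pyplot as plt

x = np.arange(100)
mean = np.sin(x / 10.0)
std = np.full_like(mean, 0.2)
mean[40:45] = np.nan                      # pretend some values are missing

valid = ~np.isnan(mean)                   # keep only finite points
line, = plt.plot(x[valid], mean[valid], label="series")
plt.fill_between(x[valid],
                 mean[valid] - std[valid],
                 mean[valid] + std[valid],
                 color=line.get_color(),  # same colour as the centre line
                 alpha=0.3)
plt.legend()
plt.show()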

How to determine the uncertainty of fit parameters with Python?

别等时光非礼了梦想. submitted on 2019-12-03 06:02:24
I have the following data for x and y:

x       y
1.71    0.0
1.76    5.0
1.81    10.0
1.86    15.0
1.93    20.0
2.01    25.0
2.09    30.0
2.20    35.0
2.32    40.0
2.47    45.0
2.65    50.0
2.87    55.0
3.16    60.0
3.53    65.0
4.02    70.0
4.69    75.0
5.64    80.0
7.07    85.0
9.35    90.0
13.34   95.0
21.43   100.0

For the above data, I am trying to fit the data in the form: However, there are certain uncertainties associated with x and y, where x has an uncertainty of 50% of x and y has a fixed uncertainty. I am trying to determine the uncertainty in the fit parameters with this uncertainties package. But I am having issues with curve fitting with scipy
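Since x carries an uncertainty as well, one option (a sketch, not the original approach; the model and the fixed y error of 2.0 are placeholders) is scipy.odr, which performs orthogonal distance regression with errors in both variables and reports parameter standard deviations directly.

import numpy as np
from scipy.odr import ODR, Model, RealData

x = np.array([1.71, 1.76, 1.81, 1.86, 1.93, 2.01, 2.09, 2.20, 2.32, 2.47,
              2.65, 2.87, 3.16, 3.53, 4.02, 4.69, 5.64, 7.07, 9.35, 13.34, 21.43])
y = np.arange(0.0, 101.0, 5.0)

def f(beta, x):
    # Placeholder model, NOT the form used in the original question.
    a, b, c = beta
    return a * np.log(x) + b * x + c

data = RealData(x, y, sx=0.5 * x, sy=2.0)   # 50% error on x, assumed fixed error on y
odr = ODR(data, Model(f), beta0=[1.0, 1.0, 1.0])
out = odr.run()

for name, val, err in zip("abc", out.beta, out.sd_beta):
    print(f"{name} = {val:.3f} +/- {err:.3f}")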

How to calculate prediction uncertainty using Keras?

两盒软妹~` submitted on 2019-12-02 18:27:34
I would like to calculate NN model certainty/confidence (see What my deep model doesn't know) - when the NN tells me an image represents "8", I would like to know how certain it is. Is my model 99% certain it is "8", or is it 51% certain it is "8" but it could also be "6"? Some digits are quite ambiguous, and I would like to know for which images the model is just "flipping a coin". I have found some theoretical writings about this, but I have trouble putting it into code. If I understand correctly, I should evaluate a testing image multiple times while "killing off" different neurons (using dropout) and
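A minimal Monte-Carlo dropout sketch (the tiny tf.keras classifier below is illustrative, not the asker's model): calling the model with training=True keeps dropout active at prediction time, so repeated forward passes give a distribution over the outputs whose spread can be read as uncertainty.

import numpy as np
import tensorflow as tf

# Illustrative classifier with dropout; replace with your own trained model.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10, activation="softmax"),
])

def mc_dropout_predict(model, x, n_samples=50):
    """Run n_samples stochastic forward passes with dropout enabled."""
    preds = np.stack([model(x, training=True).numpy() for _ in range(n_samples)])
    return preds.mean(axis=0), preds.std(axis=0)   # mean probabilities and spread

x = np.random.rand(1, 784).astype("float32")       # stand-in for a test image
mean_probs, std_probs = mc_dropout_predict(model, x)
print("predicted digit:", mean_probs.argmax(), "uncertainty:", std_probs.max())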