
Numpy deprecation test suite failure on Ubuntu 16.04 #15

Closed

fsaad opened this issue Nov 6, 2017 · 4 comments

fsaad commented Nov 6, 2017

Using make all results in two failures in the distributions test suite:

======================================================================
ERROR: distributions.tests.test_util.test_scores_to_probs
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/venv/local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
  File "/posterior/distributions/distributions/tests/test_util.py", line 38, in test_scores_to_probs
    probs = scores_to_probs(scores)
  File "/posterior/distributions/distributions/util.py", line 34, in scores_to_probs
    probs = numpy.exp(scores, out=scores)
TypeError: ufunc 'exp' output (typecode 'd') could not be coerced to provided output parameter (typecode 'l') according to the casting rule ''same_kind''

======================================================================
FAIL: distributions.tests.test_models.test_add_remove('lp.models.niw',)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/venv/local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
  File "/posterior/distributions/distributions/tests/test_models.py", line 104, in test_one_model
    test_fun(module, EXAMPLE)
  File "/posterior/distributions/distributions/tests/test_models.py", line 246, in test_add_remove
    err_msg='p(x1,...,xn) != p(x1) p(x2|x1) p(xn|...)')
  File "/posterior/distributions/distributions/tests/util.py", line 120, in assert_close
    assert_less(diff, tol * norm, msg)
AssertionError: p(x1,...,xn) != p(x1) p(x2|x1) p(xn|...) off by 0.163564845877% = 0.00525343418121
-------------------- >> begin captured stdout << ---------------------
example 1/4
example 2/4
example 3/4
p(x1,...,xn) != p(x1) p(x2|x1) p(xn|...)
actual = -1.10329115391
expected = -1.10854458809

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
Ran 292 tests in 55.957s

FAILED (SKIP=15, errors=1, failures=1)
make: *** [test_cy] Error 1

The numpy error in distributions.tests.test_util.test_scores_to_probs is due to a casting deprecation that has since become a hard error: numpy/numpy#6464
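The failure happens because numpy.exp(scores, out=scores) tries to write float output into an integer input array, which the 'same_kind' casting rule forbids. A minimal sketch of the fix, assuming the general shape of util.scores_to_probs (the stability shift and normalization here are illustrative, not necessarily the library's exact code):

```python
import numpy as np

def scores_to_probs(scores):
    # Copy the input into a float array first, so np.exp can write its
    # float output in place without violating the 'same_kind' casting
    # rule when the caller passes integer scores.
    scores = np.array(scores, dtype=float)  # always float, even for int input
    scores -= scores.max()                  # shift for numerical stability
    probs = np.exp(scores, out=scores)      # in-place exp is now type-safe
    probs /= probs.sum()
    return probs

# An integer score list no longer raises the TypeError:
probs = scores_to_probs([1, 2, 3])
```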

I'm not sure why the probability test is failing.


fsaad commented Nov 7, 2017

For the test FAIL: distributions.tests.test_models.test_add_remove('lp.models.niw',), it seems that other models also do not return an exact match between the marginal and the product of conditionals. I'm not sure on what basis the thresholds were chosen, or whether it makes more sense to report an xfail or to adjust the threshold slightly.
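For context, the identity test_add_remove checks is that the joint marginal factors through the chain rule: p(x1,...,xn) = p(x1) p(x2|x1) ... p(xn|x1,...,x{n-1}). A hypothetical Beta-Bernoulli illustration (a conjugate stand-in, not the niw model under test) shows why an exact match is expected only when both sides are computed analytically:

```python
import math

def log_marginal_chain(data, alpha=1.0, beta=1.0):
    """log p(data) accumulated one posterior predictive at a time."""
    log_p, heads, tails = 0.0, alpha, beta
    for x in data:
        if x:
            log_p += math.log(heads / (heads + tails))
            heads += 1
        else:
            log_p += math.log(tails / (heads + tails))
            tails += 1
    return log_p

def log_marginal_closed_form(data, alpha=1.0, beta=1.0):
    """log p(data) = log B(alpha + h, beta + t) - log B(alpha, beta)."""
    h = sum(data)
    t = len(data) - h
    def log_beta(a, b):
        return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return log_beta(alpha + h, beta + t) - log_beta(alpha, beta)

# For a conjugate model the two sides agree to float precision; the niw
# failure above is off by ~0.16%, which suggests a stochastic estimate
# bumping against a tight tolerance rather than a broken identity.
chain = log_marginal_chain([1, 0, 1])
closed = log_marginal_closed_form([1, 0, 1])
```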


fritzo commented Nov 10, 2017

Looks like a nuisance random failure to me. I'd recommend either twiddling a random number seed or slightly increasing a threshold.

@fsaad fsaad changed the title Test suite failure on Ubuntu 16.04 Numpy deprecation test suite failure on Ubuntu 16.04 Nov 10, 2017

fsaad commented Nov 10, 2017

It seems that lp.models.niw is already known to be buggy; I should have checked these issues first:

#6
posterior/loom#4

PR #16 addresses the numpy error.

fsaad pushed a commit that referenced this issue Nov 10, 2017
Fix numpy strict conversion error, (related to issue #15).
@fsaad fsaad closed this as completed Nov 10, 2017

fsaad commented Nov 10, 2017

Fixed by 173c55d
