MvNormal Matrix Types and Speed #688
It would indeed be nice to have this. The issue at present is that we need

```julia
julia> supertype(PDMats.AbstractPDMat)
Any
```

However, if you already have the Cholesky factor, you can use the covariance constructor that makes use of it, `PDMat(mat::AbstractMatrix, chol::CholType)`, to avoid another factorization. I don't recall any reason for not having
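To make the suggestion above concrete, here is a minimal sketch of reusing an existing Cholesky factor through PDMats.jl so `MvNormal` does not factorize the covariance a second time. The data and dimensions are made up for illustration; the constructor names are `PDMat` and `MvNormal` as used elsewhere in this thread.

```julia
using Distributions, LinearAlgebra, PDMats

# Illustrative positive-definite covariance estimate.
A = randn(100, 4)
S = Symmetric(A' * A / 100)

# Factorize once...
C = cholesky(S)

# ...and hand both the matrix and its factor to PDMat, so the
# MvNormal constructor skips a second factorization.
pd = PDMat(Matrix(S), C)
d = MvNormal(zeros(4), pd)

x = rand(d)  # draws reuse the stored factor
```

In a sampler that draws many times from distributions sharing one covariance, factoring once and reusing the `PDMat` is where the savings come from.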
I personally think we should get rid of
xref #940
I'm not sure if this qualifies as an "issue", and I apologize if there is somewhere else I should be posting this.
I have code that runs Bayesian regression with Normal/Gamma independent priors, so I need to draw from an `MvNormal` distribution tens (or hundreds) of thousands of times. Back in Julia 0.4, I could form an estimate of the variance matrix and plug it right into the `MvNormal` constructor.

Since Julia 0.5, when the Cholesky factorization got more rigid, I have needed to declare the variance matrix as `Hermitian`, or I get an error that my matrix is not positive definite (due to computer rounding error). However, `MvNormal` does not take `Hermitian` matrices, so I also need to convert to an `Array{Float64,2}` before using the variance matrix in the `MvNormal` distribution. This caused a big slowdown in Julia 0.5 (not quite as bad in 0.6).

I'm wondering if there could be a more elegant (and possibly faster) way to do what I'm doing, or if the Distributions package could be updated to allow `Hermitian` matrices as inputs to the `MvNormal` constructor.
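For reference, a minimal sketch of the workaround described above: symmetrize the estimate, wrap it in `Hermitian` so the factorization accepts it, then convert back to a plain matrix because `MvNormal` does not take `Hermitian` directly. The data here is invented; only the `Hermitian` wrap and `Matrix` conversion steps are the point.

```julia
using Distributions, LinearAlgebra

# Covariance estimate that may fail an exact positive-definiteness
# check because of floating-point rounding.
A = randn(50, 3)
V = A' * A / 50
V = (V + V') / 2          # force exact symmetry

# Declare Hermitian so cholesky/isposdef accept it, then convert
# back to Array{Float64,2} for MvNormal -- the extra copy is the
# slowdown complained about above.
H = Hermitian(V)
d = MvNormal(zeros(3), Matrix(H))

b = rand(d)               # one posterior draw
```

Avoiding the per-iteration conversion (for example by factoring `H` once and reusing the factor, as suggested earlier in the thread) is the faster route when the covariance is fixed across draws.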