
FAC - Going concern data being misreported #3433

Open
2 tasks done
stucka opened this issue Feb 20, 2024 · 4 comments
Labels
bug Something isn't working

Comments


stucka commented Feb 20, 2024

Describe the bug

The "going concern" flag is sometimes being misreported.

I don't know if there is a problem with the data as submitted (someone filling out a form incorrectly) or if something is getting parsed improperly. If the former, perhaps there is value in adding a QA step to show submitting users a list of unusual findings (e.g., when users submit a going concern issue, or material noncompliance, or ...) to verify that's what they intended to do.

@cephillips tells me she's aware of two. I have details on one, provided below.

Steps to reproduce the bug

For a Unity Homes example, visit https://app.fac.gov/dissemination/summary/2023-04-GSAFAC-0000025364

The SF-SAC button allows an Excel-formatted download of the data.

The "general" tab of that data contains a field, is_going_concern_included, which for this report shows "Yes".

From the original URL, the Single Audit Report button links to a PDF that contains the phrase "going concern" in two spots, both in innocuous uses.
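The cross-check described in these steps can be sketched as a small helper. This is a hypothetical illustration, not FAC tooling: the function names and the "Yes"/"No" flag format are assumptions, and a bare phrase count cannot distinguish substantive findings from the innocuous uses seen in this report.

```python
import re


def going_concern_mentions(report_text: str) -> int:
    """Count case-insensitive occurrences of the phrase 'going concern'
    in the extracted text of a Single Audit Report PDF."""
    return len(re.findall(r"going concern", report_text, flags=re.IGNORECASE))


def flag_possible_mismatch(is_going_concern_included: str, report_text: str) -> bool:
    """Return True when the SF-SAC says 'Yes' but the report text never
    mentions the phrase at all.  This is a crude signal: it misses cases
    like this one, where the phrase appears only in boilerplate."""
    return (
        is_going_concern_included.strip().lower() == "yes"
        and going_concern_mentions(report_text) == 0
    )
```

Note that a check like this only catches the easy direction (flag set, phrase absent); deciding whether a mention is substantive would require far more context.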

Expected Behavior

Audit data should, to the extent possible, accurately reflect the content of the audit itself.

Screenshots

No response

System setup

n/a

Additional context

No response

Code of Conduct

stucka added the bug label Feb 20, 2024
@jadudm
Contributor

jadudm commented Feb 20, 2024

@stucka , thank you.

I may have a question or two; please humor me. We all want the data collection to be excellent.

You're saying:

  1. The SF-SAC data reflects someone having indicated that a going concern existed.
  2. The report text does not reflect this.

Is that the concern?

If it is, part of the challenge is that... it's hard to validate. As you say, we could have more per-field documentation that provides more guidance/examples, but at some level... it feels like something the auditor is supposed to know/do correctly?

I'm not trying to dodge, so much as I'm trying to understand what we can do to improve the collection. Am I understanding your ticket/question correctly?

@cephillips

cephillips commented Feb 25, 2024 via email

@jadudm
Contributor

jadudm commented Feb 25, 2024

Hi @cephillips ; Matt or @jadudm is just fine. :)

For clarity/transparency, this is a GitHub comment thread, and part of the public record of the FAC repository.

I'd be happy to try and find a time to discuss, if either/both of you have thoughts about how we could help auditors and auditees more easily submit correct audits to the Clearinghouse.

What is difficult is that we have a situation where:

  1. One thing was indicated in the SF-SAC
  2. Another thing was indicated in the report
  3. The auditor and auditee certified everything is correct

The collection process asks that this information be communicated in multiple places (potentially an unfortunate aspect of the collection design at this point in time), but I struggle with how we would reliably (meaning: confidently, 100% of the time) validate that the same information is being conveyed both in the form and in the PDF. It is clear that page 40 of the audit you linked to suggests the answer to going concern is No, but... reliably finding that information in every PDF, and correctly mapping it back to the SF-SAC 100% of the time is a non-trivial task.

We are committed to doing what we can to improve the quality of the data collected in the SF-SAC and SAR, but also are constrained in a number of ways. We cannot, for example, apply a validation to the SAR (the PDF) that is not 100% reliable, if we are going to "gatekeep" on submissions.

We might be able to do some kind of analysis where we attempt (for example) to determine if the going concern field is correctly represented in the SAR. But... what if the PDF is too poorly formed to reliably find this information? Should we reject the audit? Do we... issue a warning? What if that warning is wrong 60% of the time? Should we issue it regardless? (Or, what "correctness" threshold is acceptable for a warning, and how do we ascertain what the correctness is?) Or, perhaps this is something you already have a sense for?
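The warn-don't-gatekeep analysis described above could be prototyped as a tri-state classifier that declines to issue a warning when the PDF text is too sparse to judge. This is a minimal sketch under stated assumptions: the 500-character threshold for "too poorly formed" and the reliance on the phrase "substantial doubt" (the standard auditing wording for going-concern findings) are heuristics chosen for illustration, not validated rules.

```python
from enum import Enum


class GoingConcernSignal(Enum):
    LIKELY_YES = "likely_yes"   # report text appears to disclose a going-concern issue
    LIKELY_NO = "likely_no"     # phrase never appears in the report text
    UNKNOWN = "unknown"         # text too sparse or ambiguous to warrant a warning


def classify_report(text: str) -> GoingConcernSignal:
    """Rough heuristic over extracted PDF text.  Returns UNKNOWN when the
    text layer is too short to trust (e.g., a scanned image with no OCR),
    so no warning would be issued rather than a likely-wrong one."""
    if len(text.strip()) < 500:  # assumed threshold for "too poorly formed"
        return GoingConcernSignal.UNKNOWN
    lowered = text.lower()
    if "substantial doubt" in lowered and "going concern" in lowered:
        return GoingConcernSignal.LIKELY_YES
    if "going concern" not in lowered:
        return GoingConcernSignal.LIKELY_NO
    # Phrase present but without the "substantial doubt" wording: could be
    # boilerplate (as in this report), so stay silent rather than guess.
    return GoingConcernSignal.UNKNOWN
```

Measuring the correctness threshold the comment asks about would then mean running a classifier like this over a hand-labeled sample of audits and only enabling the warning if precision is acceptable.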

If you have thoughts, tools, or techniques you would like to recommend, we would love to discuss. For example, looking at

https://biglocalnews.org/content/tools/audit-watch.html

if you have open tooling for the PDF analysis that you would be willing to help us understand, we could look at what it would take to incorporate it into the FAC. If you even have a sense for the percentage of audits this is occurring in, that may be of value, and something we can leverage as we attempt to better support auditors/auditees in their submissions. If you reach out to us via the help desk, we can find a time to discuss, if you like.

@cephillips

cephillips commented Feb 25, 2024 via email
