
Eval Staging Environment #268

Open · 34 tasks

r-bartlett-gsa opened this issue Nov 8, 2024 · 0 comments

User story

As a challenge.gov team member, in order to test existing and new platform features before they are deployed to production (and become available to users), I would like a staging environment.

Acceptance criteria:

  • Eval staging environment is set up
  • Accessibility, security, and performance scans are implemented (see the sketch after this list for one possible automated check)
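
A minimal sketch of what an automated staging check could look like, assuming a Python/pytest setup. The staging URL, the environment variable name, and the specific security headers checked are illustrative assumptions, not decisions recorded in this issue.

```python
# Hedged smoke/scan sketch: hits a hypothetical staging URL and checks for a
# few common security headers. Requires `pytest` and `requests`.
import os

import pytest
import requests

# Hypothetical variable; the real staging hostname is not specified in this issue.
STAGING_URL = os.environ.get("EVAL_STAGING_URL", "https://eval-staging.example.gov")


def test_staging_is_reachable():
    resp = requests.get(STAGING_URL, timeout=10)
    assert resp.status_code == 200


@pytest.mark.parametrize(
    "header",
    ["Strict-Transport-Security", "X-Content-Type-Options", "Content-Security-Policy"],
)
def test_common_security_headers_present(header):
    resp = requests.get(STAGING_URL, timeout=10)
    assert header in resp.headers, f"missing security header: {header}"
```

Checks like these could run in CI against the staging environment on every deploy, alongside whatever dedicated scan tooling the team selects.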

Definition of Done

Doing (dev team)

  • Code complete
  • Code is organized appropriately
  • Any known trade-offs are documented in the associated GH issue
  • Code is documented (modules, shared functions, etc.)
  • Automated testing has been added or updated in response to changes in this PR
  • The feature is smoke tested to confirm it meets requirements
  • Database changes have been peer reviewed for index changes and performance bottlenecks
  • PR that changes or adds UI
    • Include a screenshot of the WAVE report for the altered pages
    • Confirm changes were validated for mobile responsiveness
  • PR approved / Peer reviewed
  • Move card to testing column in the board

Testing (dev team)

  • Security scans passed
  • Automated accessibility tests passed
  • Build process and deployment is automated and repeatable
  • Feature toggles are used if appropriate (see the sketch after this list)
  • Deploy to staging
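
For the feature-toggle item, a hedged sketch of an environment-variable-driven flag in Python; the helper and flag names are hypothetical and not taken from the platform's codebase.

```python
# Minimal feature-toggle sketch: behavior is gated on an environment variable,
# so new code can ship to staging with the flag on and to production with it off.
import os


def feature_enabled(name: str) -> bool:
    """Return True when FEATURE_<NAME> is set to a truthy value."""
    value = os.environ.get(f"FEATURE_{name.upper()}", "")
    return value.lower() in {"1", "true", "on", "yes"}


if feature_enabled("eval_staging_banner"):  # hypothetical flag name
    print("New behavior: showing the eval staging banner")
else:
    print("Flag off: existing behavior unchanged")
```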

Staging

  • Accessibility tested (Marni)
    • Keyboard navigation (see the sketch after this list)
    • Focus confirmed
    • Color contrast compliance
    • Screen reader testing
  • Usability testing: mobile and desktop (Tracy or Marni)
  • Cross browser testing (tool to be determined) (Tracy or Marni)
    • UI rendering is performant
  • AC review (Renata)
  • Deploy to production (production-like environment for eval capability) (dev team)
  • Move to production column in the board
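
For the keyboard-navigation item above, a hedged sketch of an automated check using Playwright's Python sync API; the staging URL and the pass criterion (focus moves off `<body>` after a few Tab presses) are assumptions for illustration, not the team's actual accessibility test plan.

```python
# Keyboard-navigation sketch with Playwright
# (pip install playwright; playwright install chromium).
from playwright.sync_api import sync_playwright

STAGING_URL = "https://eval-staging.example.gov"  # hypothetical URL

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(STAGING_URL)

    # Press Tab a few times and record which element receives focus each time.
    focused_tags = []
    for _ in range(5):
        page.keyboard.press("Tab")
        focused_tags.append(page.evaluate("document.activeElement.tagName"))

    # A keyboard-navigable page should move focus onto interactive elements
    # rather than leaving it stuck on <body>.
    assert any(tag != "BODY" for tag in focused_tags), focused_tags
    browser.close()
```

Screen reader and color-contrast checks would still be manual; this only covers the tab-order portion of the list above.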

Production

  • User and security documentation has been reviewed for necessary updates (Renata)
  • PO / PM approved (Renata)
  • AC is met and it works as expected (Renata)
  • Move to done column in the board (Renata)