[Proposal] Refactor browser/device detection by adding EnvDetector concept #1620

Open · wants to merge 7 commits into dev from misc/env-detector-refacto

Conversation

peaBerberian (Collaborator)

tl;dr

Go from:

import { isFirefox, isPlayStation5 } from "./browser_detection";

export default function shouldDoSomeWorkAround() {
  return isFirefox || isPlayStation5;
}

to:

import EnvDetector from "./env_detector";

export default function shouldDoSomeWorkAround() {
  return EnvDetector.browser === EnvDetector.BROWSERS.Firefox ||
         EnvDetector.device === EnvDetector.DEVICES.PlayStation5;
}

The goal is to facilitate both test writing/mocking and the discoverability of known browsers/devices for RxPlayer developers.

Overview

I recently wanted to add tests for environment-specific workarounds (e.g. on the PlayStation 5, we have to reload as soon as an unusable decryption key is detected to avoid issues).

Yet our integration tests currently run on desktop browsers (Chrome and Firefox), so the environment detected by the RxPlayer when it plays content always corresponds to that desktop machine. Running those tests directly on all devices is not easily doable for us either.

I thus propose here a means to force the environment detected by the RxPlayer, through special mockEnvironment and resetEnvironment functions. I also refactored the way we categorize environments by putting more emphasis on the browser/device duality ("browser" names the web browser, such as "Chrome", "Firefox", "SafariMobile", etc.; "device" is a string categorizing the device/OS that browser runs on, like "WebOs2022", "PlayStation5", etc.).

Tests can then directly call mockEnvironment to make the RxPlayer "believe" it is running on another environment, allowing them to trigger its device-specific or browser-specific behavior.
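
For example, a hedged sketch of what a vitest unit test could then look like (the import paths, the Desktop device entry and the exact mockEnvironment signature are assumptions carried over from the sketch above):

import { afterEach, describe, expect, it } from "vitest";
import EnvDetector, { mockEnvironment, resetEnvironment } from "../env_detector";
import shouldDoSomeWorkAround from "../should_do_some_work_around";

describe("shouldDoSomeWorkAround", () => {
  // Always go back to the really-detected environment so tests stay isolated.
  afterEach(() => {
    resetEnvironment();
  });

  it("applies the workaround on the PlayStation 5", () => {
    mockEnvironment({ device: EnvDetector.DEVICES.PlayStation5 });
    expect(shouldDoSomeWorkAround()).toBe(true);
  });

  it("does not apply the workaround on desktop Chrome", () => {
    mockEnvironment({
      browser: EnvDetector.BROWSERS.Chrome,
      device: EnvDetector.DEVICES.Desktop,
    });
    expect(shouldDoSomeWorkAround()).toBe(false);
  });
});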

This update also broke unit tests, which I've updated. While doing that, I refactored those tests in what seems to be a more idiomatic way.
Coincidentally, it also makes them much less verbose and they work better with TypeScript. It also makes them pass in vitest's browser mode (which allows running unit tests in an actual browser; for now they sadly run in Node.js), as that mode has subtle differences when relying on the vi.doMock function like we did previously.
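
For reference, a simplified sketch of the previous pattern those unit tests relied on (module and path names are illustrative); the mockEnvironment-based style shown above replaces both the module mock and the dynamic re-import:

import { expect, it, vi } from "vitest";

it("applies the PlayStation 5 workaround", async () => {
  // Mock the whole detection module...
  vi.doMock("./browser_detection", () => ({
    isFirefox: false,
    isPlayStation5: true,
  }));
  // ...then dynamically re-import the code under test so it picks up the
  // mock. It is this kind of module-level mocking that behaves differently
  // in vitest's browser mode.
  const shouldDoSomeWorkAround = (await import("./should_do_some_work_around"))
    .default;
  expect(shouldDoSomeWorkAround()).toBe(true);
});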

Note about tests with a mocked environment

There's however a relatively big detail on which I'm not sure how to move forward: the browser actually relied on for a test may now be different from the browser "announced" during that test. Let's say I'm running integration tests on Firefox, which has some decoding issues with the current content, but I'm forcing a Chrome environment to check Chrome-specific workarounds: there, the Firefox workarounds won't be applied even though we actually need them, since Firefox is the real browser we run on.

One solution to that problem could be to rely completely on a controlled environment for tests, such as running them in Node.js with jsdom like our unit tests today, but I do prefer running tests on actual web browsers as that's much more representative of what we'll run on.

Another solution could be to handle any such issue manually in integration tests, for example by patching some API when mocking the environment. That's the solution I chose for now.
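
As a rough, purely hypothetical sketch of what such manual patching could look like (the stubbed API, MediaCapabilities.decodingInfo, and the import paths are only examples and not necessarily what the actual integration tests patch):

import { afterEach, beforeEach, vi } from "vitest";
import EnvDetector, { mockEnvironment, resetEnvironment } from "../src/env_detector";

beforeEach(() => {
  // Pretend we run on a PlayStation 5...
  mockEnvironment({ device: EnvDetector.DEVICES.PlayStation5 });

  // ...but also neutralize a real-browser behavior that could interfere:
  // here we pretend decoding is always supported so that, e.g., a Firefox
  // decoding limitation does not get in the way of the mocked scenario.
  vi.spyOn(navigator.mediaCapabilities, "decodingInfo").mockResolvedValue({
    supported: true,
    smooth: true,
    powerEfficient: true,
  } as MediaCapabilitiesDecodingInfo);
});

afterEach(() => {
  resetEnvironment();
  vi.restoreAllMocks();
});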

@peaBerberian force-pushed the misc/env-detector-refacto branch from d05cafc to c6858a6 on January 28, 2025, 14:03

Automated performance checks have been performed on commit c6858a6df8586a9b2a7796d3edabe060e527b581 with the base branch dev.

Tests results

✅ Tests have passed.

Performance tests 1st run output

No significant change in performance for these tests:

| Name | Mean | Median |
| --- | --- | --- |
| loading | 18.90ms -> 19.52ms (-0.616ms, z: 2.81295) | 26.25ms -> 26.40ms |
| seeking | 16.54ms -> 13.93ms (2.611ms, z: 0.06496) | 11.25ms -> 11.25ms |
| audio-track-reload | 25.51ms -> 25.52ms (-0.001ms, z: 1.00518) | 37.65ms -> 37.50ms |
| cold loading multithread | 47.27ms -> 47.37ms (-0.104ms, z: 8.55100) | 69.15ms -> 68.10ms |
| seeking multithread | 15.20ms -> 13.31ms (1.895ms, z: 1.38818) | 10.35ms -> 10.35ms |
| audio-track-reload multithread | 25.50ms -> 25.44ms (0.059ms, z: 0.22354) | 37.50ms -> 37.50ms |
| hot loading multithread | 15.03ms -> 14.93ms (0.106ms, z: 3.39574) | 21.75ms -> 21.45ms |

If you want to skip performance checks for later commits, add the skip-performance-checks label to this Pull Request.
