Replies: 8 comments 11 replies
-
I am in total agreement with your solution. A proxy does overall seem like the most reasonable trade-off: a little bit of security in exchange for requiring an account. The idea about encrypting an avatar discussed in the Matrix chat was more of an "I authorize this person to view my avatar" message in the form of a decryption key. I don't think you can rip an avatar or asset out of a framebuffer anyway if it never enters the framebuffer in an unencrypted state.
-
One workaround for this would be adding the proxy not as part of the domain server but as an entirely new service. Then a person willing to accept responsibility for it could run it.
-
The one thing that this specific flavour of security theater doesn't address is that anyone can simply take the asset files they already have in their cache and use those. There's no need to download anything once Overte has done that for you.
I'd say that this approach has more benefits overall than a proxy server because:
-
I haven't closely followed these discussions, so apologies if it came up already, but I haven't seen it: https://github.com/PlagueVRC/AntiRip. Do people think it would be possible to fork these things in a way that reuses the backend, only adds a frontend for Overte avatars, and is still easy to rebase? If so, it could ease the burden of having to come up with and maintain our own avatar protection system.
-
It's a really cool idea, but sadly not free and open-source software. The license is for non-commercial use only.
-
Apologies for the necrobump; let me know if this conversation is no longer accepting input or has moved elsewhere. Barring that, I'd like to provide my input here, as I participated in those conversations as well and was advocating for one of the solutions you don't agree with. Please hear me out. There are two points that I see come up in these conversations over and over, and I'd like to discuss each of them in isolation. Quoting from your messages directly, because you articulated them well, they go as follows:
**1. "I can just install Ninjaripper and steal any polygon that passes through my VRAM"**

This point seems to come up because there's an assumption that people don't understand that this is possible. While I can't speak for everyone, I personally deeply understand that once a mesh is rendered on a client, the client can retrieve that mesh from their GPU. It is because of this fact that I am not advocating for futile client-side obfuscation or other cat-and-mouse arms-race techniques. During those conversations, it was pretty frustrating to discuss asset protection because every new person that joined the conversation on Discord felt compelled to reiterate this fact. It wasn't their fault, as those conversations were moving a mile a minute and it was impossible to retrace the points of the conversations. So I'm highlighting it here to make sure you know that I'm aware and fully understand the surface areas involved with asset protection, and that the solution I'm in support of takes this into consideration. I'm politely begging that you don't tell me this again :P

**2. "Circles of trust assume that people you do decide to share with won't ever act maliciously."**

This point I think is interesting and worth discussing, because it highlights how we need to consider the people that need to be trusted when evaluating each solution. For instance, can we agree that there is a difference in security policy between the following options:
I get the sense that some people here think there is no difference between sharing an asset with a single other person and sharing an asset with everyone on the internet. After all, if that single person betrays your trust and rips your asset, then you fall back to it being shared with everyone anyway. However, I don't think that's a good reason to concede the feature altogether, and I think that overlooks an important premise of access control. When a user is given access control settings, they are given agency over their own security policy. They get to choose who to trust and create their own rules versus having the game decide for them. The status quo for VRChat,

**So what feature am I actually advocating for?**

I think you can already understand what I'm advocating for from my discussion above, but I want to be explicit so there's no question. User assets should be given a setting where users can specify who is allowed to download that asset. At a minimum I think there should be at least two options for this setting:

- Anybody
- Only People I Chose
The options function as one would expect: assets marked "Anybody" are allowed to be downloaded by anybody, and assets marked "Only People I Chose" can only be downloaded by the people specified in the setting. Assets include any UGC, but the main focus of discussion, and the most pertinent to this thread, is the user's avatar. This would be my MVP, but there is definitely more I would love to add. In the interest of keeping the conversation focused, this is all that I'm proposing. I can elaborate on more design ideas if you're interested.

**Use cases and user stories for such a feature**
**Common concerns for this feature and ways they can be addressed**

Adding access controls to assets takes what is regarded as an unsolvable problem (asset protection) and turns it into a few solvable problems. Some common critiques people bring up are:

**"What do users see for avatars that they are not allowed to download? We don't want public instances to be full of generic robots."**

This is closely related to a similar issue that platforms have to tackle: with new users, what avatar do you give them? You don't want all new users to have the same avatar, so you provide them with a bunch of options during onboarding. The more options you give them, the less homogeneous your platform looks. Instead of everyone embodying the same exact generic robot avatar, you get a mix of whatever defaults you provide. In the same way, access controls can be accompanied by a fallback, where a user appears as one of the onboarding avatars to people who aren't allowed to download their avatar. To take it one step further, users may be given the option to specify their own fallback, which they can fill in with an asset they feel less concerned about having ripped but that perhaps still represents their identity well.

**"Nobody wants this feature, so it's not worth implementing."**

I want this feature. Am I nobody? The trend I see is that technically minded people don't see any value in access control for avatars, while sentimental people do want access control. Regardless, I don't think there has been enough effort actually polling the userbases of these platforms to know if there is actually "nobody" that wants this feature.

**"Avatars are meant to be seen; what's the point if nobody can see your avatar?"**

The allowlist would be the people that can see my avatar, and the point is that I want them to be able to see my avatar without the violation of anybody else being able to rip it.
**"Access control adds too much complexity to a platform and is actually really hard to implement."**

I'm not sure how to counter this; after all, the maintainers and contributors know better than I do about what is feasible. It's hard for me to fathom, however, that of all the complexities that go into implementing a real-time social VR platform, access control is somehow an intractable beast that no engineer could pull off.

**Conclusion**

When I initially broached this topic in the Neos Discord, I had no idea it would turn into the shitstorm that it did. I'll admit that the frustration of having a swarm of people bombarding me with dismissive and snarky retorts for hours drove me pretty crazy and made me lose my cool; I wish I could take a lot of my frustration back. Despite that, I am still passionate about this issue in particular, and I'm happy to have a constructive conversation with people who want to participate in good faith. I trust that responses will take the time to read my post before dismissing me, just as I took the time to read this thread and all the discussion about this topic in the forums I'm in, and will treat me with respect. I want to advocate for the safety and comfort of the inhabitants of these social spaces: users, creators, developers, and friends.
-
@CakePost The solution you propose does seem aligned with open-source ideals, since it does not rely on obscurity and does not require a closed-source client or server to work. If you know someone who has the time and experience necessary to implement it, I'll be happy to provide advice on getting started with Overte development or other technical details.
-
This is something we explicitly have listed on the asset security page as a feature we'd like to consider: https://docs.overte.org/en/latest/security/asset-security.html#avatar-viewing-permissions As stated there, I believe it would involve adding new fields to FSTs, having the avatar server download those FSTs before sending anything to other clients (which we've done in the past, but may have disabled at some point?), and then using those new fields to decide which URL to send to whom.
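To make that concrete, here is a minimal sketch of what such a server-side decision could look like. The field names `viewers` and `fallbackUrl` are invented for illustration; no such FST fields exist in Overte today, and the real design would be decided by the maintainers.

```python
# Hypothetical sketch: choosing which avatar URL to send to each viewer,
# based on made-up FST fields ("viewers", "fallbackUrl") that do NOT
# exist in Overte today.

DEFAULT_FALLBACK = "https://example.org/avatars/default-robot.fst"  # placeholder URL

def parse_fst_fields(fst_text: str) -> dict:
    """Parse the simple 'key = value' lines of an FST-style file."""
    fields = {}
    for line in fst_text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            fields[key.strip()] = value.strip()
    return fields

def url_for_viewer(fst_text: str, real_url: str, viewer_id: str) -> str:
    """Return the real avatar URL for allowed viewers, a fallback otherwise."""
    fields = parse_fst_fields(fst_text)
    viewers = fields.get("viewers", "")  # e.g. "anybody" or "bob,carol"
    if viewers.strip().lower() == "anybody":
        return real_url
    allowlist = {v.strip() for v in viewers.split(",") if v.strip()}
    if viewer_id in allowlist:
        return real_url
    # Not on the allowlist: send the fallback avatar instead.
    return fields.get("fallbackUrl", DEFAULT_FALLBACK)
```

The server would run this check per recipient, so two clients in the same instance could receive different URLs for the same avatar.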
-
The Problem
There has been a lot of lively conversation today across the Overte and Neos communication channels about the problem of avatar/asset ripping and possible solutions. It's been expressed that a concern about transitioning to Overte is that its open nature allows anyone's avatar URL to be hijacked and their identity/work to be distributed without consent.
A number of solutions have been discussed (and frankly this topic has been discussed to death), but I'm making the argument that there are, and only ever will be, two final solutions to this problem:
There have been proposals involving various mechanisms of cryptography, serving lower-resolution versions of avatars, or circles of trust, but I argue that these are all inevitably subject to failure. Cryptography and circles of trust assume that the people you do decide to share with won't ever act maliciously. Serving lower-quality versions of an asset is going to be a non-starter for most people once they realize they can't have their 4K textures and fancy high-poly details.
At the end of the day, if you're not going to make me install a kernel anti-cheat, I can just install Ninjaripper and steal any polygon that passes through my VRAM. There are no unbreakable locks; there are only locks difficult enough that they deter all but the most motivated actors and give us good-enough security so that low-effort theft isn't possible. Content ownership in a realm where scarcity is artificial is purely a social construct, and you can't DRM your way out of a social problem.
The Solution
So once we've established that we only need to create enough of an obstacle to pass a threshold of safety, we can start exploring viable solutions that aren't laden with high complexity. Fortunately, there is a solution already used by practically every platform that serves content from an asset server: you validate that the client requesting the asset is a trusted source (to the best of your ability) and then proxy the asset URL.
Here's what this would look like in practice:
- Alice's avatar is actually hosted at `xyz.com/alice.fst`
- Other clients are instead handed `overte.org/alice.fst`
- The `overte.org` server proxies authorized requests through to the real `.fst` file on `xyz.com`
This is how platforms like VRChat and Neos work today, as well as anything else sending assets over the wire at runtime. They're not actually hosting the assets on the same domain (or even server) as the server handling the request; they're instead proxying the request to a cloud server (typically passing along their authentication for that cloud server). We could be doing the same thing, but instead proxying to the asset domain that the user provided.
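The lookup at the heart of such a proxy could be sketched as follows. Everything here is illustrative: the registry structure, the session-token scheme, and the function name are all invented for the example, not taken from Overte's code.

```python
# Illustrative sketch only, not Overte's actual design. The registry and
# session store are invented stand-ins for a real database and auth system.
from typing import Optional

# Populated when users log in and register their avatars' real locations.
AVATAR_REGISTRY = {
    "alice.fst": "https://xyz.com/alice.fst",  # real location, never sent to clients
}

# Session tokens issued at login (made up for this example).
VALID_SESSIONS = {"session-token-123": "bob"}

def resolve_proxy_request(path: str, session_token: str) -> Optional[str]:
    """Map a proxied path like 'alice.fst' to its real upstream URL,
    but only for clients presenting a valid session."""
    if session_token not in VALID_SESSIONS:
        return None  # unauthenticated: reveal nothing
    # The proxy server would fetch this upstream URL itself and stream
    # the bytes back, so the client only ever sees overte.org/alice.fst.
    return AVATAR_REGISTRY.get(path)
```

After the lookup, the proxy fetches the upstream URL on the client's behalf (attaching whatever credentials the asset host requires) and streams the response back, so the real URL never leaves the server.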
This would require that users have a user account in order to opt in to avatar protection, as you'd need to tell overte.org where your assets are actually held, but this seems like a small compromise. In practice, it would look like logging in and providing a URL and a name for the avatar somewhere in your profile.
You could of course go MUCH further with this mechanism and start talking about full cryptographic key exchange and even end-to-end encryption, but now we're just throwing more complexity at a problem that can always be thwarted by running a tool that rips polygons directly from the framebuffer.
With a mechanism like this in place, we'd be able to point to it and tell users, "We don't expose the URL for your asset; we're using a security mechanism similar to other online platforms." And for 99% of people, that security theater will be good enough.