VRchat bans mods, embraces EAC (vrchat.com)
246 points by cowtools on July 26, 2022 | 270 comments



I'm not a VRchat user. But it's sad they decided to ban modified clients. Giving the community a way to customize their experience is a great way to grow an ecosystem and learn about what your users want.

The problems they mention are all solvable without banning mods:

> Malicious modified clients allow users to attack and harass others, causing a huge amount of moderation issues.

These kinds of issues should be solved on the backend. Rate limiting, pattern matching, etc.
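The backend-side throttling suggested above can be as simple as a per-user token bucket. A minimal sketch in Python (the class name, limits, and user IDs are all hypothetical, not anything from VRChat's actual backend):

```python
import time
from collections import defaultdict

class TokenBucket:
    """Server-side rate limiter: each user gets `capacity` actions,
    refilled at `rate` tokens per second. The client is never trusted
    to throttle itself."""
    def __init__(self, capacity=5, rate=1.0):
        self.capacity = capacity
        self.rate = rate
        # per-user (tokens_remaining, last_seen_timestamp)
        self.state = defaultdict(lambda: (capacity, time.monotonic()))

    def allow(self, user_id):
        tokens, last = self.state[user_id]
        now = time.monotonic()
        # refill proportionally to elapsed time, capped at capacity
        tokens = min(self.capacity, tokens + (now - last) * self.rate)
        if tokens < 1:
            self.state[user_id] = (tokens, now)
            return False  # reject the action server-side
        self.state[user_id] = (tokens - 1, now)
        return True
```

A burst of chat messages or friend-invite spam from one account would exhaust its bucket regardless of what client sent the requests.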

> Every month, thousands of users have their accounts stolen, often due to running a modified client that is silently logging their keystrokes

Keep a list of "blessed" open source clients that have earned a reputation, and allow people to report bad ones. Chrome and Firefox extensions have solved this problem.

> Additionally, all modified clients – even ones that aren’t malicious – are a burden for creators. We regularly speak to many that have spent hours (or days) debugging user issues, only to realize that the culprit is a modified client.

Provide better debugging info for your creators! They should know which client (or at least that a modified client) is being used.

> This pain extends to VRChat support too – any time we update, we get a massive amount of bug reports that end up just being broken modifications

Ditto. These tickets should be easy to auto-close if you force folks to include info about their client.

I understand all this requires time and effort, and simply restricting the client-side of things is a low-cost solution. But if there's already a vibrant community of mods out there, I think you have a responsibility (and a vested interest) in trying to preserve that.


> These kinds of issues should be solved on the backend. Rate limiting, pattern matching, etc.

never seen that work in practice.

> They should know which client (or at least that a modified client) is being used.

That's the modified client's job, not the original client's job. If the modified client is intentionally trying to hide, it's not going to identify itself. Which is the root of the whole problem.

> Extensions

https://docs.vrchat.com/docs/what-is-udon already exists

> But if there's already a vibrant community of mods out there

is there a vibrant community of mods? Besides, by the sounds of it they have actively told people to bugger off if they use a custom client for a while now (e.g. both of the two mods I could find warn that you will likely get banned for using them).


>is there a vibrant community of mods?

The most popular client mod is emmVRC:

https://www.emmvrc.com/

Other mods exist, mostly for nefarious purposes like crashing other users who're on Quest or the vanilla client. There's a whole mini black market of "hacks", i.e. client mods built on abusing the VRChat API, as well as some clients which tie into DDoS services that will target specific users in-game if you manage to obtain their IP address via exploit or social engineering. These malicious clients are generally advertised and sold via Discord links in users' profiles. PayPal or CashApp are used to facilitate the transactions.

The game's moderation is essentially nonexistent. Anyone who does get banned for using mods, or sexually grooming underage users, or running around in nude avatars with giant ejaculating phalluses, or all of the above at the same time, can make a new account in a split second. They don't ban IPs, and only recently blocked most well-known VPN IP ranges.

I've got a folder full of OBS screen captures of ~a hundred incidents like those just described. They don't tell you if reporting a user results in moderation, to "prevent bullying," but I can tell you that users I recorded months ago running modded clients and sexually grooming children (adult with NSFW avatar explaining earnestly why incest is OK to preteens) still log in daily. I stopped filing moderation tickets when I realized nothing was happening. It's an absolute shitshow, a media exposé waiting to happen. I haven't even mentioned the staff-curated default avatar worlds with Disney and Viacom copyrighted characters.


The comment I replied to got flagged, so my reply to it is hidden too, but here are some details on the crashers, malicious mods for sale, and DDoS groups I mention above:

——

Crashers, many of which are Quest-specific, some of which are world- or avatar-based, and some of which are built into mods, are common and extensively documented:

https://youtu.be/tm0qVevUMJU

https://youtu.be/NW1XvhCzlts

https://youtu.be/Eeuf3EGgYJI

I know about the DDoS services because I followed the Discord links in the profiles of users affiliated with them, some of whom crashed me for sport in public worlds. I pulled their usernames from the logs and looked up their public game profiles. For example (sketchy link warning, may be NSFW):

https://photon-gods.cc

http://svh.natt.pw/

IP addresses of other users could previously be extracted due to VRChat’s old P2P netcode. This is a security design flaw in an MMO with a community as toxic as VRChat’s. This has now been mostly but not entirely patched (if you join a world with an older type of video player, any user in-world can submit a video URL that your client will directly fetch).

——

To be clear, banning mods and enforcing EAC won’t fix these problems. Improving moderation and patching the holes exploited by malicious mods will fix the problem.


> It's an absolute shitshow, a media exposé waiting to happen.

It's already happened. In the UK on Channel 4 at least: https://www.channel4.com/programmes/inside-the-metaverse-are.... It shows many of the incidents you've described. Including things that you probably haven't experienced (the player was a black woman, so you can imagine what she experienced).


That’s bad. However none of the issues they have are unsolvable. Just add EAC and provide an official mod api through a workshop or launcher. Creating a healthy eco system of moderators is doable as well.


> running around in nude avatars with giant ejaculating phalluses

I can kinda see how this is fun the first time, or in a very specific age range, but I think the novelty would quickly wear off.


Ever heard of Eternal September?


Why would they ban their primary userbase?


[flagged]


>You have no idea what you're talking about regarding mods. Like, at all.

Enlighten us!


You're the one who has to explain yourself.

> Other mods exist, mostly for nefarious purposes like crashing other users who're on Quest or the vanilla client.

"Mostly" has statistical implications. Where are you getting this statistic? Do you have any data to back what you're saying? Which are these mods, and what exactly do they do? Are you saying there's a rich ecossystem of software specifically designed to interfere with the experience of Quest users? How do you even know that?

> There's a whole mini black market of "hacks", i.e. client mods built on abusing the VRChat API

This statement makes no sense. What VRChat calls its "API" is a backend for wrangling metadata. Access to this backend is available to all user accounts and generally not discouraged. There is even a message in the HTTP headers inviting users to apply for a job there.

> as well as some clients which tie into DDoS services that will target specific users in-game

> These malicious clients are generally advertised and sold via Discord links in users' profiles. PayPal or CashApp are used to facilitate the transactions.

So these are standalone applications that cost money? Is the money to pay for denial of service botnet time, or a one time cost? How do you even know? And I don't see how a criminal operation of this calibre will be stopped by EAC.

> if you manage to obtain their IP address via exploit or social engineering.

IP addresses are public data and denial of service attacks are a crime wholly independent from any platform. If you already have the user's IP address, anything to do with VRChat and mods seems redundant.


I will gladly explain, I’ve spent hundreds of hours in VRChat and many hours digging into its darker side.

Crashers, many of which are Quest-specific, some of which are world- or avatar-based, and some of which are built into mods, are common and extensively documented:

https://youtu.be/tm0qVevUMJU

https://youtu.be/NW1XvhCzlts

https://youtu.be/Eeuf3EGgYJI

I know about the DDoS services because I followed the Discord links in the profiles of users affiliated with them, some of whom crashed me for sport in public worlds. I pulled their usernames from the logs and looked up their public game profiles. For example (sketchy link warning, may be NSFW):

https://photon-gods.cc

http://svh.natt.pw/

IP addresses of other users could previously be extracted due to VRChat’s old P2P netcode. This is a security design flaw in an MMO with a community as toxic as VRChat’s. This has now been mostly but not entirely patched (if you join a world with an older type of video player, any user in-world can submit a video URL that your client will directly fetch).

I’m curious why you so energetically promote a false, rosier view of the situation.


Glancing at the links you provided, I see some unfounded rumors and some (three?) references to the creation of avatars that overload users, causing them to crash (including one of the features in the bottom links). It's common knowledge that this is possible, and VRChat should probably do a better job of analyzing avatars at time of upload.

EAC does not prevent this in any way, shape or form.

I see what you mean now about abusing the API - they seem to be invite bombing or similar. This is not what DDoS means. If possible, this is definitely a rate limiting issue to be solved on the backend. EAC does not prevent this.

The client in your last link appears harmless at a glance, though maybe if I dug enough into it I'd find something unsavory. Did you look at what it does?

I don't think you have provided any evidence of harmful modding that will be affected by this patch yet. It's easy enough, on the other hand, to locate the github repositories for the hundreds of harmless mods that tens of thousands of community members use regularly, and which you don't seem to know about if you think these videos suffice to describe what mods are "mostly" used for.


It sounds like you’re a VRChat mod user who is unhappy with the new update, and thinks I am defending it based on the bad mods it will ban. This is not what is happening. I am ambivalent to negative on EAC. The purpose of my comment above is to raise awareness of the bad parts of VRChat generally, especially to HN users unfamiliar with this game and its risks for younger users. I think EAC is a bandaid that will negatively impact good users and have minimal impact on bad actors.

Many mods are used for malicious purposes. This doesn’t mean ‘ban mods,’ it means patch the holes bad mods exploit.


I agree with your sentiment there for sure, but this is a submission about EAC. You have filled it with off-topic comments in a way that makes it seem you don't understand what the conversation is about. You made a very misleading statement when asked about the modding community that strongly implies it's mainly nefarious and exploitative, and that's what I replied to.

You know who has patched some of those holes you're talking about? Modders. There are mods that mitigate crasher avatars.


I submit that the problem is actually bigger than you think, there are in fact more malicious mods than you realize, and drastic measures should be taken. I don’t think EAC is the right fix and I think it’s being made for the wrong reasons.

Most regular users don’t use or know about anti-crasher mods, and in any case the problems are inherent to the game’s design. Fixing those problems should be VRChat’s main goal.

The OP is about VRChat mods. I started talking about malicious mods and segued into bad stuff on VRChat in general, which I think is both relevant and interesting to HN users. If you don’t like it, I encourage you to help remove the bad stuff.


Hey since you claim to be a knowledgeable VRC user - which I have no reasons to doubt - do you think it’s all about those crashers or could it be also to stop avatar rips, maybe foreshadowing microtransactions?

I’m completely uninitiated on VRC, but I do see some artists who have been offering VRC avatars and accessories through external services like booth.pm for a couple of years now. Rips and thrown-together pirated models were a huge problem at least initially, and I would imagine that would only be helped by mods.

Could this be that VRC is finally moving onto integrating avatar/editor marketplace and locking down on “illegal” avatars as part of that move?


I think avatar data is stored in a cache file, so banning modified clients won't necessarily prevent rips.


For the sake of understanding the problem, any avatar marked as public can, or at least could (I can't say I've looked into this problem recently) be obtained in the following manner (DON'T do this, as it's still both a violation of the TOS and copyright infringement):

- Change into the avatar.

- Ask the VRChat API (backend) for information on yourself (this requires no special permissions).

- Use the link contained in the response JSON to download the avatar to your computer (it's how the official client obtains it, too)

- Unpack it with literally any Unity asset pack extractor, of which there are several. The asset pack is not secured in any way.

Note how mods are conspicuously absent from all of these steps.

I'm an avatar maker myself, and at this point my stance is that I can't be responsible for people stealing assets from me that I'm not allowed to redistribute, but I'm open to directly giving anyone anything I'm allowed to redistribute, and to teach them how to implement, download or purchase everything else. It's a load off my mind.


>never seen that work in practice.

Linden Labs did just that -- and almost every other better-practices VR thing -- and it worked just fine.

Linden Labs and their teething problems should be the bible for anyone starting a 'VR-experiences' style platform, but for some reason their past struggles have been widely ignored by the newer players in the field.


The problem is that if everyone with "metaverse" hype actually looked at Second Life[0], they'd realize that there's not really much to get hyped over. The final and most important lesson of Second Life was that most people do not want a 3D world chat room.

[0] Or ActiveWorlds, or the metric buttload of VRML chat services before that...


Yep. Some people do, they still have an audience, but it's not the revolution it was supposed to be.

It's also a great example of a hype cycle - it was going to be the future, it was going to be everything, corporates had offices in there, universities had campuses, there were virtual music events and art galleries, there were real-dollar millionaires made from virtual real-estate.

When the hype died down and the smoke cleared, there were still people wanting to use it, but not that many, and not in the way we were all told to expect, and the corporates realised they didn't need a bank branch or an SL office full of virtual meeting rooms in there after all. Neat sandbox with interesting ongoing uses, not the new centre of all our online lives.


Makes one wonder whether remote work is going to find a home in those online worlds. The mechanics make sense, and it gives a sense of togetherness.


And this will hopefully mean the end of Zuckerberg's expansion. It would be totally brilliant if this ended Meta.


I think COVID and GenZ growing up on the internet has changed the equation. Will the metaverse take over the world? Probably not. But now that knowledge work has gone remote, entire countries remain blocked to tourists, and people are coming down with flus they need to isolate for 10 days for, I think having a virtual sandbox to socialize in has more appeal.


I think you missed the actual answer. There is a near future sci-fi book called Rainbows End written by a computer science professor 15 years ago (and that predicted a lot of stuff, like... cryptocurrency) that has an answer for this.

What will be the real "VR metaverse" for most consumers is the one that maps to reality. The AR metaverse. Being able to visit people in real life in real locations, virtually: augmented reality. Where the virtual reality part of that is a small extra side bit for the dedicated people.


I agree with this. I'm just saying in my opinion, with our current tech, VR and the Metaverse isn't something to be ignored. AR can definitely be bigger but it's important to remember that thousands of knowledge workers have begun their career never interacting with another person in a workplace.

While there used to always be soft pressure to structure your life in the real world, with work and school online, folks who otherwise would be pressured to interact offline might now find interaction online to be just as fulfilling. Or at least equally fulfilling for things like work while saving quality, offline time for friends.


But none of that looks like an excuse for VR to exist. That generation of remote first, they would be less interested in an hmd-based simulation of a meeting room, not more interested. They want a good camera, to be a better face on a screen, not better screens on their faces.


Of course, and they are doing this; there's huge demand for nicer cameras, ring lights, backlights, etc. for upcoming streamers and influencers. But this isn't a zero-sum game. It's not like we have to Zoom every hour of our work lives. Startups like Gather Town are predicated on finding a permanent, but not total, place in the future of remote work. If my company pays for a subscription to Minecraft For Work (TM) so that we can have remote tech talks or remote social events on it, then there's value, even if it's only used a couple of hours per week per team at a large company.


This is the exact same proposition that Second Life had, 15 years ago. Cross-continental teams can 'meet' in a shared virtual space and it'll be just like you're there! And after all the hype ... it just wasn't something that people did.

I know there are headsets now, maybe it will go differently. Maybe not.


Not being in Gen-Z, perhaps I don't get it.

But having seen virtual worlds come and go multiple times, I'm going to assume that it's another hype cycle until there's compelling evidence to the contrary.


Agreed.

You might as well not be a person if you live in a virtual world...necessities like money and basic human needs mean people outgrow their fantasy world, and their fantasy games. Moves like this cause fragmentation, such that there is no one virtual world.

As your userbase gradually dwindles to nothing...they grow up and have lives. Or don't, and are usually characterized by low paying jobs, poor health.

Of course, not every low-paid worker is there by choice, and some are born disabled, but on average these games are a cause of their plight.


> Or ActiveWorlds

Every time I see news about the "Metaverse", I remember the articles Shamus Young (RIP) wrote about ActiveWorlds and the DotCom bubble crash:

https://www.shamusyoung.com/twentysidedtale/?p=35399

The metaverse doesn't seem quite that bad (Facebook's people seem to have an understanding of the medium they want to operate in), but the basic "we're going to make the user experience worse by replicating the limitations of the real world for the sake of realism" mentality is still there.


I've been inviting people to try SL (OpenSimulator, the open source version of SL), and most people actually like 3D worlds. Many people don't even know these things exist because their mainstream experience is mobile games and flat websites. But SL has a terrible user interface and experience, and it's so complicated that it alienates those users. Even an advanced user would have a hard time figuring out all those legacy things and concepts; it's the Unix of VR worlds.

As for the metaverse, people need to realize that it's not for everyone. Maybe in 10-20 years or so.


The entire point of Second Life is to have an infinitely customizable character with user-created clothing, etc. VRChat avatars are fully baked 3D models that require advanced modelling skills even for something as basic as changing the colour of a t-shirt.


>The final and most important lesson of Second Life was that most people do not want a 3D world chat room.

Isn’t this what most teenagers use Fortnite and Minecraft as?


On the internet “most people” are an imaginary statistic we toss around thinking it means anything.


Roblox and Minecraft both are good examples of Worlds where people meet up to socialize and play games.

I'd say, Roblox is more sticky, as there's a built in friend list and chat/messages without being in game.


No, not really. You're thinking of Discord.

Fortnite and Minecraft do have chat features in them, but those are features tacked-on as a last resort. In the case of Minecraft you don't even have voice chat. In the olden days of the Xbox 360's heyday[0], all we had were one-on-one voice channels or in-game chat, the latter of which would be an absolutely toxic nightmare of getting shouted and screamed at.

Furthermore, Fortnite and Minecraft are games whose camera and movement controls are optimized for the specific game mechanics they feature. If you want to create a generic 3D world, then you have to also create a generic set of controls to move and look around that world; and those controls are always going to be awkward and difficult to use.

Maybe you built a regular house, in which case Sims-style point-and-click-to-move and a fixed camera angle would be your best control option. Or maybe you built a racetrack with cars, which means you want WASD-steering with a car-locked third-person camera. Your FPS level wants first-person or over-the-shoulder third-person.

Second Life tried to do all of the above, so they gave you a default behind-the-head camera that vaguely works in open spaces, but you have to use Maya-style camera orbit controls to look at specific things. You can scroll to zoom to first person, but you can't use any of the builder controls in that view because of mouselook. And all of this was complicated enough that Linden Labs felt it was necessary to add "easy" movement and camera controls driven by UI buttons, but not to add click-to-move for third-person mode.

The thing is, when I look back at Second Life I can't really think of a way that it could have been done better. I can think of features that it's missing, but that wouldn't fix the clunkiness of the controls. Anything I could think of to fix camera movement would bias in the direction of a particular 'genre'; and making people switch between "FPS mode", "car mode", and "hanging out mode" would be a total pain.

This isn't even getting into the other problems these projects have, like the fact that 3D worlds are really expensive to build, or that the copyright maximalists will have a field day suing you over all the pirated content that your users will inevitably port in.

[0] As in, "blades dash" era 360


They use Fortnite and Minecraft as something to do while they chat in another client, usually Discord. Modern multiplayer games have largely reduced (or eliminated) communication with people you aren't already friends with.


SecondLife, Fortnite, Roblox, etc (even VRC on PC) are what Matthew Ball refers to as "proto-Metaverses" as in 3D experiences locked inside 2D screens. But you can't compare those experiences to what is possible when you can see 3D in 3D either overlayed on top of reality or in a totally animated world space. It's Apples and Oranges -- yes they are both fruit; but you'd be very wrong indeed to think because one is good in a pie the other must be.


Personally, I hadn't thought of it this way (mostly dismissed the idea). I'm sure it's a much different experience and one would have to experience it to properly judge it. I think the difficulty with the other, more generic, platforms was that the question of "what is the point" came up, a lot. And when users got to define "the point", sometimes "the point" was pretty disturbing ... most of the time it was just boring.

My biggest concern is that I've yet to find a VR headset that I can use for more than about ten minutes before feeling nauseated from motion sickness. I'm told this is somewhat/mostly solved and I've used higher end gear but it's been the same result for me every time.


> My biggest concern is that I've yet to find a VR headset that I can use for more than about ten minutes before feeling nauseated from motion sickness.

Sometimes I see these issues brought up, and I always wonder whether they are associated with applications that move your virtual body artificially without you actually moving in real life, instead of keeping your body in place like Beat Saber, or jumping by teleportation as Half-Life: Alyx allows.

It seems a popular mistake for VR newcomers is to get in (or be put in) a rollercoaster, or to drive a car, where these kinds of problems are very visible.


There is a vibrant community of mods. There are more than 30 thousand people in the modding group Discord. Compare to less than 70 thousand people online in the platform at time of writing - a sizeable minority of the community uses (and a subset of them needs) mods. There are more than 200 safe working mods right now.


Udon is not a modding API and was never intended to be one. It is purely a world-centric scripting system, which is great for building minigames inside VRchat, but not for extending the client's features on a global basis.


> These kinds of issues should be solved on the backend. Rate limiting, pattern matching, etc.

> never seen that work in practice.

Isn't this solved by basically every game with a persistent economy? If e.g. Eve online or Runescape just trusted the client they would have fallen apart years ago (not mentioning WoW until the next para because its economy is less important/more isolated to each individual player than a lot of other mmos).

Afaik in World of Warcraft (and by extension probably a lot of other similar games) a bunch of important stuff is totally serverside - your current HP/Mana/Inventory, skill cooldowns, buffs etc. None of that is controlled by the client, and the best you could do by hacking it would be to get it to temporarily show information that was different from what everyone else sees, and couldn't affect any other players.

Movement in WoW (afaik) is a bit more of a hybrid thing - it's a mixture of the client immediately responding to player input, and the server only resetting player position/kicking people off the server if it determines that the player could not have performed that movement with their maximum speed.

With a lot of games this isn't actually that hard either - you just encode rules about what players can do (max movement speed, rate limit amount of chat messages per second, abilities that that player has access to etc) and kick people from the server if their client claims they're doing anything impossible.
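The "kick anyone doing the impossible" rule from the paragraph above can be a few lines of server code. A sketch (the speed limit and function names are invented for illustration; the key point is that the server keeps the previous position and timestamp itself, and the client only reports where it claims to be now):

```python
import math

MAX_SPEED = 7.0  # hypothetical server-defined cap, world units per second

def validate_move(last_pos, new_pos, dt, max_speed=MAX_SPEED):
    """Return True only if the client-reported move is physically possible.

    last_pos and dt come from server-side state, never from the client,
    so a hacked client cannot lie about where it used to be or how much
    time has passed."""
    if dt <= 0:
        return False  # nonsensical or replayed packet: reject outright
    dist = math.dist(last_pos, new_pos)
    return dist / dt <= max_speed
```

A client claiming to cross the map in one tick simply fails the check and gets its position reset (or the connection dropped), no client-side anti-cheat required.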

In competitive shooters it's a bit different - you don't want to let clients send you technically legitimate "I shoot in this direction" commands that come from an aimbot instead of a mouse, so a multifaceted approach is important here. But VRChat isn't a competitive shooter.

With turn based games it's really trivial, just have clients send you their decisions for each turn, run all the game logic server side and have the client be a dumb UX that plays the animations it's told to.
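The dumb-client pattern for a turn-based game might look like this sketch (the moves and state are made up for illustration; what matters is that the client only submits a choice and the server runs all the game logic):

```python
class TurnServer:
    """Authoritative turn-based server: the client sends a move name,
    the server checks legality, mutates state, and returns the result.
    The client renders whatever it is told; it never computes outcomes."""
    LEGAL_MOVES = {"draw_card", "end_turn"}

    def __init__(self):
        self.hand_sizes = {}  # all game state lives server-side

    def submit(self, player, move):
        if move not in self.LEGAL_MOVES:
            # a modified client asking for something impossible is
            # trivially detected here: flag, rate limit, or kick
            return {"error": "illegal move"}
        hand = self.hand_sizes.setdefault(player, 0)
        if move == "draw_card":
            self.hand_sizes[player] = hand + 1
        return {"player": player, "hand_size": self.hand_sizes[player]}
```

A hacked client can only ever send strings from the legal move set; "give me 100 cards" isn't expressible in the protocol.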

===================================================

Having an authoritative client and just trying to detect cheating on the server is very difficult/impossible - but that's because it's totally the wrong way to solve the problem in the first place.

You want to have the backend doing all the important logic, and have the client be as dumb as possible (or write a deterministic game engine and have the server replay the same inputs at the same time from the same state and boot people when the outcome doesn't match what they told you happened, but this requires engineering up front - it's not an easy thing to retrofit into an existing game).

===================================================

Edit: Realised WoW is a good example for mods too - the client is moddable, but there's a clearly defined API for what mods are allowed to see/do. It works pretty well.

Doing this the right way is not impossible, it's just probably more engineering effort than the devs of VRChat are willing to put in.


> Doing this the right way is not impossible, it's just probably more engineering effort than the devs of VRChat are willing to put in.

Those guys push major updates on friday evenings, introducing new bugs and causing clients to break right when the weekend parties are scheduled to happen. I doubt it's a matter of "willing" and more about competence.


> multifaceted approach is important here

Valve is pretty good at this, and they don't even use kernel level anti-tampering as a facet. Trust factor, VACnet (machine learning) and overwatch (community demo review) add up to a system that's pretty decent at not matching you with cheaters if you aren't one.


> Isn't this solved by basically every game with a persistent economy? If e.g. Eve online or Runescape just trusted the client they would have fallen apart years ago (not mentioning WoW until the next para because its economy is less important/more isolated to each individual player than a lot of other mmos).

This is easy when a couple of seconds of latency is acceptable. In an action game, as opposed to an MMO, it isn't; even 100ms is likely to lead to a very subpar experience.

Thus you end up doing stuff client side, even if you sync up with the server a couple times a second.


A couple of seconds latency is basically unplayable in any genre (except for turn based stuff I guess). When I used to play WoW latency was generally < 100ms.

You can paper over latency by immediately showing animations and stuff clientside, and rolling back if the server decides it didn't happen - most games do do this. But there's not a multiple second gap between e.g. you activating a spell in WoW and the outcome being registered by all players involved - that would be actually unplayable as you say. There's also sub second cast times combined with spells that interrupt/stun those other spells.

Nowadays there's not much difference between a lot of MMOs and what I assume you're referring to by action games - see New World[1], Black Desert, Tera, Elder Scrolls Online for a bunch of examples of games that have fast paced combat with small cooldowns and reticle aiming while also being full fat MMOs.

Besides, this thread is about VRChat, which is definitely not an action game.

[1] https://www.youtube.com/watch?v=a8JiJN8942o


Udon isn't an extension.

There are many mods, authors, and tens of thousands of hours of work put in by game devs.

It's disrespectful to be commenting about a subject you don't have experience with. It's only muddying the waters.


> never seen that work in practice

it often doesn't work out even if you do ban modified clients. there's an impressive array of hacked clients for many games out there.


> never seen that work in practice.

Except in every competitive online game that predates EAC. Rule #1 in networked games is "don't trust the client".


I’d also invite anyone interested in this space to look at both the rules for World of Warcraft addons and their evolution over time. Many actions are doubly rate limited. Not only can you not spam a single action, you can typically only call one rate-limited API action per event handler. So you can make a button that can do one of three actions depending on state, but if a second fires it generates an error.


Until you get a Lua unlocker. Because those limits are enforced by the client, not the server.

I’m not arguing that there’s no rate limiting going on in the server, but some things such as not being able to call several api commands with only one hardware event are enforced client side because by definition the server has no idea about whether a hardware event happened.


Yes and no. The rate limiting can still be done server side. Reacting to events is simplest on the client, but if you were being a hard ass about it you could do something with correlationIDs. I think it depends on how much state history you want to retain on the server to map actions to outcomes.
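For illustration, a minimal server-side token bucket keyed by (player, action) — a sketch with made-up limits, not anyone's actual backend:

```python
import time

class TokenBucket:
    """Minimal server-side rate limiter: each (player, action) pair
    gets `capacity` tokens, refilled at `rate` tokens per second.
    Illustrative only; a real server would persist this per session
    and decide whether to reject, queue, or flag for moderation."""

    def __init__(self, capacity=5, rate=1.0):
        self.capacity = capacity
        self.rate = rate
        self.buckets = {}  # (player, action) -> (tokens, last_timestamp)

    def allow(self, player, action, now=None):
        now = time.monotonic() if now is None else now
        tokens, last = self.buckets.get((player, action), (self.capacity, now))
        # Refill based on elapsed time, capped at capacity.
        tokens = min(self.capacity, tokens + (now - last) * self.rate)
        if tokens < 1:
            self.buckets[(player, action)] = (tokens, now)
            return False
        self.buckets[(player, action)] = (tokens - 1, now)
        return True


limiter = TokenBucket(capacity=3, rate=0.5)
results = [limiter.allow("alice", "spawn_object", now=0.0) for _ in range(5)]
print(results)  # [True, True, True, False, False]
```

The correlation-ID idea would sit on top of this: tag each client action with an ID so the server can map actions to outcomes without trusting the client's claim about which hardware event triggered what.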


That’s an endless arms race. They are complaining about how much time they wasted on mods.


Even if it is an arms race, adding anti-cheat software will not make the client trustable. The arms race, as you say, will continue.

Even Ring-0 anti-cheat software hasn’t kept Valorant cheat and abuse free.

The server is the only software not in “malicious” hands; it’s the only place anti-abuse checks are viable.


Pretty much all AC focused on protecting the client and game memory was already ring 0 before Valorant even came out.


How is "don't trust the client" an arms race? But now we're moving past the simple empirical fact that for countless games it worked, and still works.

> They are complaining how much time they wasted on mods

You mean their complaint that they wasted time on bug reports coming from modded clients? And instead of using the obvious, dead simple solution of appending a client ID to the bug report to filter out modded clients, they prefer to use this as an excuse to add DRM?

Guild Wars had a similar issue of getting bug reports from clients due to hardware errors. Did they ban all PCs not built by an authorized OEM? No - they simply added a hardware health check to their game, and ignored bug reports by clients that failed it.
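The filter could be as dumb as this sketch (field names invented). As others point out, a deceptive client could spoof the ID — but the bug-triage use case only needs to catch honest-but-unaware reporters:

```python
# Hypothetical triage filter for a bug-report queue: auto-close
# reports whose attached diagnostics show an unrecognized client build.
# "client_hash" and KNOWN_HASHES are invented names for illustration.

KNOWN_HASHES = {"a1b2c3"}  # hashes of official client builds

def triage(report):
    if report.get("client_hash") not in KNOWN_HASHES:
        return {"status": "auto-closed",
                "reason": "unrecognized client build; "
                          "please reproduce on an official client"}
    return {"status": "open"}


print(triage({"client_hash": "deadbeef"})["status"])  # auto-closed
print(triage({"client_hash": "a1b2c3"})["status"])    # open
```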


> How is "don't trust the client" an arms race?

Exactly. You write the game logic, you know what the client is and isn't allowed to do. Just ban/kick if the client tries to do something it isn't supposed to do.

The only thing stuff like EAC should be needed for is stopping people from delegating the production of legal inputs to an aimbot or some external script instead of their actual fingers, which doesn't seem relevant for VRChat.


Another reason for EAC would be a kind of cheat people built for a recent Battlefield game: a modded graphics driver that changed colors to make other players more visible. There's just no way to detect this server side.


That doesn't seem very relevant for VRChat, either.


>Just ban/kick if the client tries to do something it isn't supposed to do.

Of course, couple this with a deep understanding of what your client actually can do. Oh, and keep track of updates, or you'll have a lot of banned users.


The "dead simple solution of appending a client ID" would completely fail if the modified client sent the Client ID of a supported client.


Correct. The problem is users not mentioning they have modified clients when reporting e.g. avatar rendering issues to individual modders. The users aren't being deceptive, they're just non-technical


Welcome to the fruits of a computing industry that hasn't focused on making computing more accessible cognitive load wise in decades.

Specs locked behind paywalls/licensewalls. Closed-source APIs, lock-in via cryptography... It all comes together to create just what large software platforms want, but what people like Engelbart were trying to avoid.

A technically ignorant/illiterate user populace.


Cheating was rampant in those games? Even nowadays the most liberal modding policy for multiplayer games is "modded copies cannot play with unmodded copies", to try and cordon off modded users from being able to scam/grief/etc. the general player base.

Is there any online game that has permitted modded clients playing with unmodded clients and had it go well?


In a game this makes sense, cheaters ruin the competitive aspect of the gameplay. But here people are objecting because there's no competition. VRChat's claims are that modded clients result in security exploits for unsuspecting users and can cause instability for non-modded users. So the question arises, can VRChat plug these problems without resorting to anti-cheat? Was anti-cheat just the lazy option, or are there inherent aspects of the client which make it hard to plug these security exploits or the instability?


Minecraft. World of Warcraft. FFXIV. Just to name 3 off the top of my head.


It's worth noting that WoW addons also backfired at certain points but for different reasons. Example being AVR that drew shapes over the 3d world marking boss mechanics, essentially trivializing them and leading to Blizzard locking out that API entirely from addons. It didn't affect other users, though gave a competitive advantage to some.


And there is a hell lot of complaints about cheaters on hypixel, no?


Does squeenix explicitly allow FFXIV mods now?


Ehhh. I'd say they tolerate it.

Their policy is basically "Don't Ask, Don't Tell". So as long as you aren't yelling at people about DPS meters or streaming with obvious mods, they'll leave you alone.


> Cheating was rampant in those games?

But what kind of cheats? Aimbotting and seeing through walls don't apply to VR chat. And no matter how modded your client was, you couldn't spam endless grenades in Counter Strike - the server knows how many grenades you have, and where you can throw them.
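That grenade check is the whole principle in miniature — a toy sketch of server-authoritative inventory (names invented):

```python
# Server-authoritative check, sketched: the server keeps the count,
# so a modded client claiming extra grenades just gets rejected.
inventory = {"player1": {"grenade": 2}}

def handle_throw(player, item):
    count = inventory.get(player, {}).get(item, 0)
    if count <= 0:
        return "rejected"  # client claimed an item the server never granted
    inventory[player][item] = count - 1
    return "ok"


print(handle_throw("player1", "grenade"))  # ok
print(handle_throw("player1", "grenade"))  # ok
print(handle_throw("player1", "grenade"))  # rejected
```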


Aimbotting/wallhacks are relevant, as there are games within VRChat where those are relevant.

Due to StarCraft 2's design, defogging hacks were plentiful.

Anyone who has played TF2 for a significant amount of time has run into hackers landing impossible shots, e.g. through walls

Before EAC, there was VAC (along with two separate community-run anti-cheat services for Source games), BattlEye, Warden, etc. It seems universal amongst those involved that client-side anti-cheat helps way more than you'd expect.


I'm gonna play devil's advocate here.

I pretty much agree with all your points, but what I see here is that VRChat developers or product people are choosing the solution that requires the least effort. They may have other features or fixes in their roadmap and decided to squash all the fixes into a single one: banning them.

If I were them, given the popularity of modified clients, I'd try to turn that into a feature that sets them apart from competitors, but I still think it's a rational choice to ban them given how much developer time costs.


Here’s the problem: they trust the client. The client is in the hands of malicious actors. Anti-cheat - even ring-0 anti-cheat - does not make the client trustable. The client is still in malicious hands, and that trust will continue to be abused.

Their “simplest possible solution” has not solved the problem.


More than this, it sounds like a serious net negative to me.

Legit users who had modded their clients to have some perfectly fine features (such as performance enhancements) are SOL - they aren't gonna risk it just for features they've lost, they either gonna live with whatever's left or just abandon it entirely.

And those with malicious intent are just going to work around the anticheat (because why not - what they were doing was most likely a bannable offense already) and continue. Yes, there's some barrier, but just the other day we had a thread where DMA hacks were mentioned (and PCIe cards for that are pretty cheap) - does EAC have anything against those tools?


Don’t let perfection be the enemy of good

We don’t put locks on our doors because they’re unbreakable but because they are the right compromise between cost and benefit


If the true goal is to enable creators, why not let the creators choose whether to require EAC to connect.

It should not require much extra developer time at all to let both users and creators choose whether to join an EAC or non-EAC instance.


Excellent idea, and in theory others could create better alternatives for centralized moderation.


That's what I was going to say. The bullet points sound like a ton of work. And I'm sure the devs put considerable effort into exploring these points. There may have even been code in the repo to address these points.


EAC is about protecting game files. This is to protect cosmetic sales, nothing else.


You can already import any models you want (read: models you purchase from eg. gumroad) by uploading them through their unity sdk.


> These kinds of issues should be solved on the backend. Rate limiting, pattern matching, etc.

This is a crazy hard problem that is plaguing the majority of our mainstream social platforms. Looking at Twitter/Facebook/Reddit etc., moderation and harassment are mostly handled by brute-force community management, which is extremely costly and has a human component that is hard to ignore (Facebook moderators are plainly miserable and prone to lasting mental harm even after leaving the job).

This is where Nintendo decided to bail out of the problem altogether by blocking personal interaction by default on their online platforms. If they don't have the resources to do it properly, limiting the attack surface feels like a sensible strategy.


>This is a crazy hard problem that is plaguing the majority of our mainstream social platforms.

...that rely on advertising.

It's comparatively easy to moderate communities if you're willing to permanently ban bad actors. Trouble is, that shrinks your audience, which reduces advertising revenue. Reporting accounts on Twitter, Facebook, Instagram, et al, is a multi-step process of making excuses for why your reported post won't actually negatively impact the malicious account.


As weird as it feels, this might actually be one of the very few cases where platforms are willing to throw users under the bus for better ad money...

Youtube demonetizing creators with too aggressive stances is an example of that, but we also had Reddit banning whole subreddits when their sheer existence had a nefarious effect on the platform's ability to negotiate ads.

When the choice comes down to more active users but lower prices offered, vs. fewer users and better ad prices, platforms tend to go for the latter, I think (4/5/8chan being exceptions).


> Provide better debugging info for your creators! They should know which client (or at least that a modified client) is being used.

> Ditto. These tickets should be easy to auto-close if you force folks to include info about their client.

As a (former) modder, though not for VRChat, I can definitely empathize with this. Code quality with mods is a freaking wild west where modularization isn't really a thing and everyone monkey-patches everything. Supporting your mod/content amidst other mods is in the gray area where the etiquette expects content creators to support their stuff even though there's no "warranty" on the label, so you get significant variance in how demanding players are or how accommodating content creators are.

That said, I don't think anti-cheat is the answer here. Officially supporting mods and driving modified clients down a predictable path where you'll know when and how a client is modded seems like a better approach. Modders benefit from having real infrastructure to work with, and as the game dev you can ask players to turn off mods for troubleshooting.


Yeah, I really don't get it here.

For competitive games, I can see the argument and demand for anti-cheat.

Here, though? Any actual problem it would solve is better addressed either server side or on the counterparty clients. It seems backwards to require full control of all client software rather than filtering out spam on the receiving and/or intermediary ends.


Modified clients were already banned; this is a technical measure to enforce that policy.


It’s a long list of excuses and they know it. It’s harder to monetize third party clients just admit it.


I find that very naive. Moderating online communities is famously difficult and unavoidably expensive. This is well documented going back to MUDs on dialup BBS systems.


> These kinds of issues should be solved on the backend. Rate limiting, pattern matching, etc.

Isn't VRchat... Spoken? It's kinda hard to rate limit and pattern match speech. Text is one thing but it's going to take some compute power to do what you're asking.


> Malicious modified clients

hehe there's a spiderman knuckles that spams webbing and it fills the entire room and you can't see anything it's funny.


That's hilarious /s


Who else is just depressed as fuck to realize that our VR experiences are likely going to be hamstrung by things like EAC, and there's probably little that we'll be able to do about it. This is not the metaverse future that I had envisioned. I guess they think I need to be protected from a "Snow Crash".


As a frequent VRChat user: the problem is entirely sloppy API design, poor to nonexistent moderation, a security team detached from reality (the team lead posts often in the VRChat Discord, or used to), and the fundamental problem of running a no-stakes social game where getting banned and losing your account is no obstacle to further bad behavior. Getting banned from WoW means you lose your time investment in level 60 and raid gear; getting banned from VRChat means you make a new dummy email and log right back in. Any VR game with real time investment doesn't have this problem.


Don't be, you're being a Doomer. For every negative experience, there can be positive ones. Check out Mozilla Hubs [1]. It's FOSS and runs in browser. The metaverse doesn't have to be negative. That said, VRChat's problems have everything to do with the abuse that mods entail rather than any inherent "Snow Crash"-iness to the situation. It's the age-old fight between bad actors and anti-tampering mechanisms.

[1]: https://hubs.mozilla.com/


Interesting, never heard of that. I'm testing it and the lack of mouse movement inversion, both x and y, drives me crazy. Did I miss it or they haven't implemented it yet?


Huh do you mean via headset or just using mouse and keyboard? It works fine for me.


Using mouse and keyboard, but I later found that I could adapt to it.


it isn’t clear to me how you could have expected anything but this…

there are far too many hacks, too little moderation, and too little community pushback on bad actors. this will _always_ happen if the communities themselves don’t figure out a way to chase the trash out.

it doesn’t matter if it’s a social network, it doesn’t matter if it’s a game, it doesn’t matter if it’s a forum. if it relies on money, then either the community will make trash unwelcome or the company will, and of course the company will do what’s the most convenient for them.

we talk a lot about we want companies to keep things open for communities to run, then when companies try a hands off approach the communities sit by and do nothing while malicious actors take over lol.

then, strangely, the communities who did nothing to address actual real problems act all shocked.


Good points. There are solutions. A more transparent and community-oriented governance model would help.


Maybe this would have gone over better with the community if they had made the most popular mods redundant before rendering them unusable?

> Finally, we’re aware that many legitimate users install modifications to add features they wish VRChat had natively. We're very aware of the popularity of these modifications, and we’re aware that EAC means those modifications are gone, too. As such, we've been working towards native implementations of features like a main menu that's usable even when you're lying down, a portable mirror that you can use to calibrate your full-body tracking (or provide a face-cam), and more – all planned for upcoming releases.


> Maybe this would have gone over better with the community if they had made the most popular mods redundant before rendering them unusable?

...in what reality do IKv2, OSC, and Avatar Dynamics not count as literally exactly that?

IKv2: IKTweaks

OSC: VRCSTT and other mods that leveraged tweaking avatar parameters

Avatar Dynamics: Cross- avatar interactions


> we've been working towards native implementations of features like a main menu that's usable even when you're lying down

Imagine not having even this yet and then pretending you weren't the one who dropped the ball

I hope this is the beginning of the end of VRChat and I never thought I'd be holding out hope that Zuckerborg's version will fix the mess that it was.


>Maybe this would have gone over better with the community

What community is that, exactly?

Because at least here in 2D land most seem to be accepting censorship and “moderation”



'“Modified clients” are a large problem for VRChat in a variety of ways. Malicious modified clients allow users to attack and harass others, causing a huge amount of moderation issues. Even seemingly non-malicious modifications complicate the support and development of VRChat, and make it impossible for VRChat creators to work within the expected, documented bounds of VRChat.

'In order to prevent that, we’ve implemented Easy Anti Cheat (EAC) into VRChat.'

All this stuff sounds like it should be enforced on the server side, not client side.

'EAC is the industry-leading anti-cheat service. It’s lightweight, effective, and privacy-focused. In short, for any game or platform looking to prevent malicious users from breaking the rules, it’s a powerful solution.'

EAC is the industry-leading DRM ass. It sometimes persists game installations and you need to go to greater lengths to purge the system of its trace. I don't know what 'privacy-focused' even means here.

Honestly this whole post smells of PR.


It's simple: have servers that enforce the no-mod rule, and servers that don't. Even back in CS 1.6 there was the concept of VAC-enabled and non-VAC servers. Hackers could go to non-VAC servers and hack to their heart's content.

You can have official servers that are safe from "harassment", and let people have their own "non-secure" servers they do whatever they want in. It's not that complicated.


Have you played VRChat? It actually is a real problem. Anyway, why not try to enforce security on both sides? EAC will literally fix 99% of the malicious issues…


The crashers in public lobbies are real problems, but you should be aware that the vast majority of malicious behavior is not coming from modified clients. It's coming from avatars with specifically programmed shaders which causes others' clients to crash, or purposely malformed package files that cause the client to crash, or sound sources that cause the client to crash. All of those methods to maliciously crash other users are uploaded with VRchat's own SDK in Unity and requires zero 3rd party mods. On the flip side, there were mods specifically designed to prevent crashes by these malicious avatars that can no longer be used due to EAC (FinalIKSanitizer, TrueShaderAntiCrash, etc).


Because it's throwing the baby out with the bathwater. Some of these mods were literally about making the game accessible to people who wouldn't be able to use it otherwise. There's no way they're going to keep up with all useful mods with their official client. Some of their concerns are valid, but at the very least they should provide an option to allow modded clients on your own private instance.

Unless they manage to make VRChat mainstream (which will likely erase a lot of its appeal to plenty of current users and will mean directly competing with Meta), this may be the beginning on its slow end.


>Some of these mods were literally about making the game accessible to people who wouldn't be able to use it otherwise.

How?


There are mods that fix significant performance problems, accessibility mods such as colorblindness improvements, etc.


And lazily screw over 1% of legitimate and interesting creativity by well meaning developers.

This is always a stupid call.

Open source the clients and do the hard work of server side controls to remove abusive behavior.

Imagine forcing everyone to use one proprietary web browser with DRM because webdevs could not be bothered to implement rate limiting and input validation.


I find that number optimistic. They’re still trusting a client that’s in malicious hands, and no anti-cheat will ameliorate that flaw.


You have succinctly described the #1 hard problem in multiplayer games. There is no solution, only ways to do better. A good recent example was Fall Guys being basically unplayable due to cheaters; they had to up their game with an anti-cheat engine to make it playable again.

The issue is that because there is no perfect solution even if you have a pretty good all server side solution you’ll still improve by adding client side anti-cheat.

Anti-cheat systems are the compromise that allow games to be played on PCs instead of only locked down dedicated hardware.


I can't help but feel Fall Guys was not the ideal example, given just how comically vulnerable it was. It trusted the client for so much, instead of running calculations server side, that you could just tell the server you're at the finish line and disqualify everyone else.


i don't get why so many software companies want to prevent modified clients. The fact that they can detect the client is modified (and kick them out) means they could've gone a different route - allow the instance host to choose whether to allow it or not.

For some instances that are public, and needs rules and clients to obey those rules, it would make sense to ban modded clients. But for some private instances, it doesn't make sense to ban anyone, as it's private and presumably accessible to only trusted parties. The instance creator ought to be allowed control.

In a similar vein, the new minecraft reporting features (https://www.minecraft.net/en-us/article/addressing-player-ch.... ) affects private servers as well, and it's not an opt-in by the server admin.


This!

For example, Discord has a splendid unofficial client (Ripcord), but because they don't care about their users they choose to ban people who use it and force them to use the original lag-o-matic mammoth of a client (like, seriously, an IM that takes up a gig of RAM on its own).

Source: https://news.ycombinator.com/item?id=25217170


Their client is terrible. You can't prevent it from snooping on your processes without putting it in a sandbox.


I use web only, block every access to things and customize it with Firefox builtin custom CSS. It will put me in an idle state when I'm not interacting with it compared to a desktop client that knows when I move my mouse.


> I use web only

Which is intentionally garbage to use. I'm with the GP, pop it into an unprivileged LXD container and stop dealing with their corp snooping nonsense.


?? It's entirely identical to the desktop version. Desktop is just an electron wrapper that adds fancy integration like rich presence and in-game overlays and whatever else.


I use both the web and the desktop version depending on what I'm doing and I'd say they are identical except for a few features that obviously cannot work on the web (eg global keybinds for push-to-talk).


lol.

This brings me to a different pet peeve of mine. Discord is really opaque in a way that it handles activity status, it doesn't show you what status it thinks you are (i.e it can show you as Online but Away or Mobile to others). Which is very annoying when working with others.

For times when I work off my iPad, I use a vnc and xdotool to press a button every few seconds on my PC (not in Discord though, don't want to get banned). That is the only way I found to make sure it shows me as Online and not breaks their T&C.


same. but I get the feeling that the era of services like that available in the browser is near its end. 10-20 years, and their website will only have links to 'get our app'


i'd just use the web client for discord.


The why is spam prevention, which is why it’s against their ToS to send automated messages as non-bot accounts. How do you (somewhat) enforce that? You make everyone use their non-scriptable client. Is it perfect? Hell no. Does it work? Yes.

Like I get that this is HN and the hacker ethos is “anything I can do as a human I should be able to automate” but that makes the already intractable spam problem that much harder.


A spammer can automate the spamming via tools like AutoHotkey, rather than modding a client. It's slightly more resource intensive, but not by much IMHO.

Preventing spam is more a task for server-side detection and filtering. Gmail's been doing it very well for decades, and I can't see why similar methods won't work for messaging. Adding a server-side rate limit is another defence. Banning modded clients is a bad trade-off for most users while providing very little actual protection.
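A sketch of what such server-side filtering could look like — flag a sender who posts too many messages, or too many near-identical ones, inside a sliding window (all thresholds invented):

```python
from collections import deque
import hashlib

class SpamFilter:
    """Server-side sketch: track each sender's recent messages and
    flag bursts or repeated content. Illustrative thresholds only;
    a real system would combine this with content classifiers."""

    def __init__(self, window=10.0, max_msgs=5, max_dupes=2):
        self.window = window        # seconds of history to keep
        self.max_msgs = max_msgs    # max messages per window
        self.max_dupes = max_dupes  # max identical messages per window
        self.history = {}           # sender -> deque of (timestamp, content_hash)

    def check(self, sender, text, now):
        h = hashlib.sha1(text.strip().lower().encode()).hexdigest()
        q = self.history.setdefault(sender, deque())
        while q and now - q[0][0] > self.window:
            q.popleft()  # drop entries outside the window
        dupes = sum(1 for _, past in q if past == h)
        q.append((now, h))
        if len(q) > self.max_msgs or dupes >= self.max_dupes:
            return "flagged"  # rate limit / mute / escalate to moderation
        return "ok"


f = SpamFilter()
print(f.check("seller", "BUY GOLD CHEAP", 0.0))  # ok
print(f.check("seller", "BUY GOLD CHEAP", 1.0))  # ok
print(f.check("seller", "BUY GOLD CHEAP", 2.0))  # flagged
```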


Stuff like AutoHotKeys is far slower than a multi-threaded bot driving the protocol directly, and spam is all about speed/volume, so in practice if spammers are reduced to that it's a big win.

Gmail actually does rely on various forms of "anti cheat" to control spam, including obfuscated code that detects attempts to automate the browser. That doesn't help for inbound mail but it helps a lot to stop people sending spam from Gmail.


> Stuff like AutoHotKeys is far slower than a multi-threaded bot driving the protocol directly

Not by an order of magnitude, esp, for chat applications that are limited by http application protocols.

> ..that detects attempts to automate the browser.

Nobody sends spam from the gmail web browser client, but suppose google successfully bans (somehow?) all other email clients, the enterprising spam sending people would still easily automate the browser, esp. with things like autohotkeys (which doesn't require modifying the browser itself, nor run code in the browser).

The point is, banning a client side mod for any service is irrational if the justification is to prevent spammers.


I used to work on Gmail anti-spam. Blocking attempts to send spam through the browser UI was most of our job. Driving browsers by things like AHK is (a) detectable and (b) definitely slower by more than an order of magnitude! You can much more easily run many requests in parallel (through proxies) within the RAM budgets of ordinary desktop/laptop computers, when going direct to the network. You don't have to believe me, go read spammer forums to see people discuss how much better multi-threaded bots are.

"the enterprising spam sending people would still easily automate the browser, esp. with things like autohotkeys"

It's not at all easy for them, and people who try approaches as simple as that don't make any money. Or didn't when I worked on it at least.

"The point is, banning a client side mod for any service is irrational if the justification is to prevent spammers."

That belief is widespread, but I built a part of my career on proving it wrong. We got unexpectedly excellent results from cracking down on alternative clients that were pretending to be browsers.


The point of third party clients is not even to expand on any functionality, but rather to strip down unnecessary bloat of the official client.

Users still get banned for that, even when they are well within reasonable use, and then they are unbanned with words “This serves you right, now use our spyware* of a client”.

* Discord used to scan your entire HDD “for games” when starting up.


> In a similar vein, the new minecraft reporting features (https://www.minecraft.net/en-us/article/addressing-player-ch.... ) affects private servers as well, and it's not an opt-in by the server admin.

What does this mean? Do users not connect directly to private servers? Does every minecraft client have to authenticate with a central server before joining a private one?

Or is it just an on-by-default "use official banlist" option?


Yes, you do need to authenticate before joining a privately hosted server, unless you change 'online-mode' in server.properties in the server folder. This came with a few issues. Offline servers usually had a plugin which let you create a password connected to your name in-game, since anyone in theory could just change their name to yours and get your inventory and permissions, including OP. AuthMe is an example of an authentication plugin, if you're interested in checking it out.

So central authentication has been a part of the game since the beginning of multiplayer more or less, but Microsoft has taken it quite a bit further with the recent changes. Quite a shame to potentially be banned from a private server I host myself, whitelisted only for IRL friends. How strict the criteria to actually get banned is irrelevant in my opinion, I bought the game over ten years ago, and now suddenly risk losing access to multiplayer, which was the only reason I bought the game. But we'll have to see how it actually gets implemented, I really hope it's opt-in, or at the very least possible to opt out for privately hosted servers, but I have no issue with it being implemented on Realms.


> Does every minecraft client have to authenticate with a central server before joining a private one?

yes - login is required by the user iirc. You need a microsoft account to play minecraft (and i believe they recently migrated all old mojang accounts to microsoft now?)


It used to be "Migrate voluntarily and get a free cape, we will never force you to migrate"

So much for that.


You'll also need to provide a phone number, with the usual BS of the requirement suddenly appearing after you've migrated, for "security reasons". They know people would bail on the conversion if they made this requirement up front, so instead they take your account away and hold it ransom until you cough up the goods.


Oh, this is common practice? I noticed my account was suspended a few days after creation for a "ToS violation" - requiring phone verification to reactivate - and have so far spent hours of support's time and my own hunting down the why.


Yes. They are forcing them all to migrate to Ms accounts.


> What does this mean? Do users not connect directly to private servers? Does every minecraft client have to authenticate with a central server before joining a private one?

Minecraft servers in the default "online mode" authenticate users against Mojang's "session server". The clients get a token when they log in to their launcher and the servers then validate that token when the user logs in. This verifies both that the user is who they're claiming to be and has a legitimate copy of Minecraft.

Server admins can turn off online mode, but this means that anyone can claim to be anyone else unless some independent authentication scheme is implemented. As a result it's usually only done for actual offline servers and those that want to allow pirate copies to play.

I'm not super familiar with the current controversy but the understanding I have is that chat reports could potentially result in being unable to log in to online mode servers which seems likely to have overreaching effects.


I haven’t looked into it because it seemed straightforward. But now you have me curious.

The clients always authenticated before launching the game and required a valid session to join even private servers. I’ve assumed the change just disables the option to join any games similar to if they weren’t properly authenticated.


> Does every minecraft client have to authenticate with a central server before joining a private one?

Yes. The way it works is that when a client attempts to join a (private) server, it sends the user ID to the private server, and then sends Mojang a message informing them that it's trying to connect to a server at this address.

The private server then asks Mojang if the supplied user id is valid, and attempting to sign in to the server at this address. Mojang then replies to the server confirming the authentication of the client, and then the server lets them in.

There's also an exchange of cryptographic keys mixed in there, but that's the basics.
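One concrete piece of that key exchange worth knowing about if you ever implement it: the "server id" hash the client and server both compute uses a non-standard digest. Minecraft treats the SHA-1 output as a *signed* big-endian integer, so the hex string can come out negative. A minimal sketch in Python; the single-word inputs below are the well-known test vectors from community protocol documentation (the real handshake hashes serverId + sharedSecret + publicKey):

```python
import hashlib

def minecraft_digest(data: bytes) -> str:
    """Minecraft's non-standard SHA-1 hex digest: the 20-byte hash is
    interpreted as a signed big-endian integer and printed in hex, so
    results can be negative and have leading zeros stripped."""
    n = int.from_bytes(hashlib.sha1(data).digest(), "big", signed=True)
    return format(n, "x")  # Python keeps a minus sign for negative ints

# Documented community test vectors:
print(minecraft_digest(b"Notch"))  # 4ed1f46bbe04bc756bcb17c0c7ce3e4632f06a48
print(minecraft_digest(b"jeb_"))   # -7c9d5b0044c130109a5d7b5fb5c317c02b4e28c1
```

Both sides compute this digest independently and the server only asks Mojang about the session if they match.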


They all have to use the central auth server, which does the banning. You can turn off using the central auth server, but it comes with downsides (no skins, and no way to stop people from using your username without a separate auth plugin).


Runelite takes Runescape to the next level


And it is fully approved by Jagex!


>i don't get why so many software companies want to prevent modified clients

Perhaps not the most crucial reason in non-competitive gaming like VR Chat, but generally many companies want to have full control of their interface. It often totally makes business sense, but comes at a cost to the users.

I worry about the day reddit closes their APIs the same way Twitter did.


It is also a security issue for users apparently, such as client mods that are also keyloggers and their accounts getting stolen.


VRchat actively borrowed ideas from mods when the base game was barren of features, then turned around and vilified them - these mods are created by other game devs. Features such as global dynamic bones interactions were ripe for a software patent. Yet, mod authors open sourced it.

Tens of thousands of hours have been spent here, and it actively increased the user count and value of VRchat. Yet Tupper (CEO) not only gives nothing in return, but actively takes the work and shuts down the original.

They even banned other game devs who made good mods that were commonly known. But they didn't ban any malicious authors, because those were unknown. (Duh)

They have made agreements with mod authors not to interfere with premium features, and the authors even obliged.

What even is there to "anti-cheat" in a social game where you can do anything? This isn't an FPS: no goal, no "game". That logic is implemented per world by world creators.

Just completely disrespectful and slimy. Anything this team touches is tainted as far as I'm concerned. You don't make your product and paycheck off others' work and then slap them with it. Especially not your fellow industry.

Tbh if I were Valve I'd be enticed to delist this game for that behavior. It's against what the VR ecosystem is for.


Agree with most of your stuff, but just want to reply to your last sentence.

Valve doesn't care about VR. Last week's Steam VR Fest is proof.

Games that literally have basic functionality broken and are on the way to be completely shut off (due to servers being unavailable soon) were still discounted and promoted on page 1. While new indie games doing innovative stuff were left to rot on page 60. Nothing changed in the algorithm compared to previous year, promoting the same set of games once again.

On top of that, the SteamVR Unity plugin, which arguably at some point powered most VR games, hasn't been updated with any new functionality in years.


I have a separate Windows machine I use for all of my gaming, including VRChat, that does not touch my Linux network or any of its files. I started doing this around the time Windows 10 came about because I didn't want to take a gamble with what any of these "security" implementations would do. I tried VRChat because it's free and works with and without a VR headset.

Modded clients aside, it's fairly easy in VRChat to have an avatar with so many performance-hitting effects that it can outright crash other players on lesser hardware in PC worlds. I've heard anecdotes that the Quest version, while receiving recent improvements, is also prone to crashing in mixed PC/Quest worlds. To be fair, the default view settings are aggressive enough to cull the more offensive avatars (for both performance and moral objections) back to a safe default, and they can be adjusted to taste. Avatars are also graded by their performance impact, and you can see a full spec sheet of their polygon counts and various performance metrics right within the game.

They do a decent enough job providing good defaults, and the content is gated in a way that lets you individually whitelist people and specific avatars/worlds as you experience them. There is a global "Trust Tier" system that allows you to move up the ranks by racking up play time and friend count; new players are tagged as visitors for long enough that it's not too hard to weed out bad actors. I don't believe I've encountered any modded clients personally, but I will be interested to see what kind of effect this has in the in-game world.


> I don't believe I've encountered any modded clients personally but I will be interested to see what kind of effect this has in the in-game world.

Wow, I only played for a few hours and it seemed like every lobby had someone trying, and succeeding, to crash people.


You can crash people quite easily in game without any mods by just loading an overly-complex avatar. There is the content gating system, but quite a few people just turn it off, hence the crashing.


Semi related: I don’t use VRChat, but EAC is the worst piece of anti cheat software I’ve been subjected to.

It’s opaque, unhelpful, and constantly crashes my computer. There’s a whole stack of stuff I can’t play with friends, because EAC politely sends my computer into a boot loop every time I try to play something that uses it. Apparently it’s my choice of hardware, because even clean Windows installs aren’t enough to avoid its ire. And of course it won’t tell you, and Epic’s help amounts to “uuhhh update your bios lol?”.

At least Valorant’s anti-cheat has the dignity to tell you what it doesn’t like, and how to resolve it.

So good luck to all the people who will effectively lose access to VRChat because of Epic’s awful piece of software.


Client-side anti-cheat has always been anti-consumer. It's more of an invasive DRM.


There exist different levels of invasiveness. I've had issues with EAC multiple times, forcing me to reconfigure my system just so it'd let me play games. Something like VAC might not be as good at catching cheats as EAC, but it rarely causes problems and works great on a wide variety of systems and setups.


> EAC is the worst piece of anti cheat software I’ve been subjected to.

I don't understand how this could be the case, TFA says it's industry-leading

/s


It's another case of where "security" means keeping your computer secure from you. And the password stealer argument is just stupid: by the time EAC kicks in, your password is already stolen.


This is only partially true. Once the knowledge that unofficial clients won't work gets out, people won't go looking for them.


Unofficial malicious clients will still work, only the safe open source ones won't. Dodgy EAC bypasses seem to be easy to find, and will likely put people who try to use them at real risk.


It's actually really hard to distribute a cheat to a lot of people without it getting flagged.


Yeah, let me clarify my argument there. Open-source, safe, well-known clients, mods, or modding systems can easily be monitored for workarounds and flagged, and become unviable. Malicious actors with ulterior motives, however, can keep finding new exploits and releasing small, obscure, harmful software that gets way less exposure, because the vast majority of people are not interested in harming anyone; they just want their mods to do something minor, like displaying screenshots in the loading screen or adding support for eye tracking. They will not seek the malicious ones out. Those will be sold (financial incentive) or used by people with too much time on their hands and an axe to grind.


How so?


They are saying that if you download an application (e.g. vrchat.exe) from the internet, then it can steal your passwords as soon as you launch it.


That's not what VRChat's blog post says at all, though.

"Every month, thousands of users have their accounts stolen, often due to running a modified client that is silently logging their keystrokes as well as other information. These users – often without even realizing it! – run the risk of losing their account, or having their computers become part of a larger botnet."

People running a backdoor client and getting their passwords stolen? That doesn't sound... wrong?


The wrong part is that somehow EAC will help prevent that. It won't.


It would prevent it in the sense that you'd know the mod you are downloading is not likely to work, so you wouldn't download it in the first place.


Not really, some mods will still work


The chance that some application downloaded for free off the internet will bypass EAC is minuscule.


You can do whatever you want with your computer, just don't try to connect to their proprietary servers.


But now you're into the Service-as-a-Software-Substitute problem. I can't usefully use the VRChat client without doing that.


I agree but I was replying to your initial point about locking down our computers. I think the downvoters don't get this.

I think there's a case to be made that VRChat allowing mods in a safe way is possible, if they wanted to do it. But they clearly don't see value in that. I hope they're proven wrong soon.


The vast majority of Second Life users have been using primarily third-party viewers since Linden Lab open sourced it in 2007. It's been a boon for users. It just takes a bit of up-front design and planning. And Linden Lab is actually profitable. Profitable enough to dedicate significant resources to developing a new platform (Sansar), discard it, and still be profitable.


> “Modified clients” are a large problem for VRChat in a variety of ways. Malicious modified clients allow users to attack and harass others, causing a huge amount of moderation issues.

Imagine if non-game companies were this open about their servers being exploitable


I don't know why you got downvoted. It's really odd that they don't enforce this stuff server-side.


why do you think no multiplayer video games have tried that?


Because... they do...?

Some things are legitimately much easier to restrict client-side, e.g. for an FPS: you can send other-user data all (or much) of the time and not have to worry about whether they're looking in that direction, or whether there's line-of-sight through the hole they just shot in a wall. That kind of thing takes a lot of server-side compute, and unless it's perfect it's still a competitive advantage.

But when it's not competitive? And they're basically just blocking abusive clients? Heck no, they do server side restrictions. And they often pay the price badly when they do not.
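A toy sketch of the non-competitive, server-side version of that, assuming nothing about any particular engine: the server decides per tick which other players a client is even told about, so a wallhacking client has nothing extra to render. Real implementations layer line-of-sight raycasts and latency slack on top of plain distance; the names and constant here are illustrative.

```python
import math

VIEW_DISTANCE = 50.0  # illustrative cutoff, not from any real game

def visible_players(me, others, view_distance=VIEW_DISTANCE):
    """Server-side interest management: return only the players close
    enough to possibly be seen; everyone else is simply never sent to
    this client, so there's nothing for a modded client to reveal."""
    visible = []
    for p in others:
        if math.hypot(p["x"] - me["x"], p["y"] - me["y"]) <= view_distance:
            visible.append(p["name"])
    return visible
```

The trade-off the parent comment describes is exactly the latency problem this creates: a player who steps out from behind a wall is only revealed one server round-trip later.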


Huh? Almost all of them do. Try logging into WoW, send a packet that says you picked up a million gold, and observe what happens to your account.
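That's the server-authoritative model in a nutshell: clients only *request* actions, and the server computes outcomes from its own state. A minimal sketch of the pattern, assuming nothing about WoW's actual protocol (all names here are illustrative):

```python
class GameServer:
    """Toy server-authoritative loot handling: a forged 'I picked up a
    million gold' message is simply ignored, because the server only
    grants what exists in its own world state."""

    def __init__(self):
        self.gold = {"player1": 0}
        self.ground_items = {"loot1": 50}  # item id -> gold value

    def handle_pickup(self, player, item_id):
        value = self.ground_items.pop(item_id, None)
        if value is None:
            return False  # no such item server-side: reject the claim
        self.gold[player] += value
        return True
```

The client never tells the server how much gold it has; it only names an item, and the server decides what (if anything) that's worth.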


Since VRChat isn’t open-source and runs the servers they can do whatever they want. They could add in required ads and telemetry. They could integrate blockchain. And they can do this five years from now when they are much more popular and much larger communities are tied to them.

If you’re upset that they’re banning modding and adding an invasive anti-cheat, it’s a good time to look for alternative VR platforms or start creating your own. This is one of those cases where "open-source" means something. And while the metaverse isn’t small or new anymore, it’s still experimental, buggy stuff, so a small team working in their free time may actually have the potential to compete.


> telemetry

I believe they do collect a ton of data on how the user interacts with instances already.

> it’s still experimental buggy stuff

Kind of. They did have several years to build up a community and find a content creation pipeline that kind of works for a lot of people. And many millions of VC.

You're right, though, ideally the "metaverse" should run on open source software.


VRChat was the first time since the 2000s that I felt the “magic” of the old internet. It had the promise of quirky, unique, and individual (not “personalized”) experiences. It came with the ugly parts too, and a little risk, and it was kind of beautiful. Sad to see the direction they’re taking. I guess easy product placement is more important than community.


I feel the sentiment in my soul. Both the awe and wonder at the sense of adventure and unbridled creativity of it all, and the mourning at the knowledge that this is the beginning of the end.


The ugly parts have taken over. The Second Life users have moved in and now dominate many spaces, along with tons of kids ("Questies") and the usual always-online contingent of furries, extremists of various sorts, and adults being creepy around young kids.

There's still cool stuff there and unique and positive interactions to be had, but it's a slog.


We Second Lifers moved in over four years ago, long before anything went viral, and long before any of the people doing the complaining. We've been here from the start.


You're sort of proving my point here.

"Second Life Is Plagued by Security Flaws, Ex-Employee Says; A former infosec director at Linden Lab alleges the company mishandled user data and turned a blind eye to simulated sex acts involving children."

https://www.wired.com/story/second-life-plagued-security-fla...

This community is now deeply embedded in VRChat, another platform full of unsupervised children and security flaws.


Did the security flaws in Second Life tag along with the users' exodus and make VRChat catch them? How does that work?


A certain type of community is drawn to a particular kind of platform.

In this case, an exploitable platform filled with children and near-zero moderation of sexualized content. As you know, VRChat doesn't even have an NSFW tagging system or option for users to filter such content out (without hiding all custom avatars, i.e. disabling VRChat's core functionality).


The lack of an NSFW content filter is mind-boggling. They have offered an "NSFW" checkbox when uploading avatars for years, but still to this day have given users no way to filter out avatars tagged "NSFW" in game. They could solve most of this problem fairly easily by blocking users from wearing avatars tagged "NSFW" in public instances. NSFW content doesn't belong in public instances and indeed it's a bannable offense already - so why they have not made this a priority is a mystery. Guess the developers are too focused on implementing EAC.


human nature unhinged from physical constraints. for some reason despite growing up playing and hacking MMORPGs this never really dawned on me..


VRChat doesn't reflect human nature overall, just those who fail to adapt to the mainstream and hide out online 24/7. High overlap with Second Life, 4channers, and heavy Discord users, less of an overlap with people who play competitive games or contribute to society.


I very much doubt this point. My reference is political comments on Facebook: these beautiful people, who otherwise pay taxes, have kids, and maybe even a picket fence, can spew hate and nonsense like there's no tomorrow, and they do so with their face attached. So while the particular things you encounter on VRChat might not be statistically representative of the population, it really is just human nature, and if everyone were there, you'd see a similar level of unacceptable behavior, just maybe a different flavor of it.

What I think happens in places like this is similar to what happens in the crypto world. Money and financial situations can be abused hard, and they have been for a long time, so we have a good amount of rules, laws, international agreements and so on regarding them. Then crypto came along: it kinda resembled money, and had none of these restrictions. So every trick is new again: market manipulations, all the different kinds of fraud, all untouched by the existing legal framework we have for regular money. I think online communication enjoys a similar situation, or at least it did, for quite some time.


The slice of humanity on VRChat is significantly more neurotic and hypersexual than the average Gen-Xer ranting on Facebook. They’re also more OK being inappropriate around young children, the other large VRChat constituency. It’s worse in VRChat than in any other VR game or online multiplayer experience I’ve had (and I’ve had a lot) except maybe Second Life.

The Facebook crowd would be just as bad in VR, probably, but VRChat is its own sort of hive of scum and villainy.


Exactly, all these useless people are wasting their lives. Imagine how much they could accomplish if they joined the Russian military and successfully conquered Ukraine; they would be heroes.


The cynic in me thinks that their real plan is just to add monetisation and or implement/sell features that already exist in modified clients and that "security" is a convenient excuse.


They already do that: increased number of avatar favorites.

emmVRC, a hugely popular legitimate mod (referred to as a "wholesome" mod by the community), used to have this feature, but removed it and implored their users to support VRChat instead if they wanted it. Other more gray-area mods still have it.


This is a real shame. I met many interesting people on VR chat. Some were exploring identities that only a virtual world would allow them to. Mods enabled these people to lose themselves in the fantasy of their identity. I heard sad stories about how VRchat was where they could be themselves.

They're going to lose that opportunity now. For what? Tighter control so VRchat can engineer their own knock off mods? Greedy. Sad.


Wouldn't want people cheating in our highly competitive virtual hangout video game.

At least it still works on Linux.


> At least it still works on Linux.

But Linux probably no longer works on VRChat

https://www.youtube.com/watch?v=G2u7NOpzcBQ&t=5052s


Wow, that's amazing! Thanks for the link. Makes me want to dust off the old Quest 2 and play some VRChat to meet some nerds :)


This still works. This wasn't a disallowed mod, this was part of a custom world made for VRChat.


"Cheating", IE sideloading anything useful or improving beyond the base app? Sounds like they want to compete with Second Life for imaginary scarcity.


I'm aware of several communities which seem to rely heavily on custom mods. Not sure what they are going to do, though on the plus side maybe we will see more innovation in the multiplayer VR marketplace.


I agree. I was being facetious.


Possibly part of aggressive anti-NFT movement?


No, EAC breaks the VR version on Linux; only the desktop version works. They are mainly interested in supporting the Steam Deck, so who knows what priority this issue will receive, especially since it's very likely an EAC issue and not a VRChat one.


Well, this is a disaster then. I was excited to get back to VRChat once I got a Linux compatible headset, but I guess that's not possible anymore. What a terrible change.


I'm not a game programmer. Why do games like VRChat and Fortnite need to use client-side mod detection at all?

If everything goes through their servers, it seems like a huge waste of time. If it's peer-to-peer it makes sense.

I'm trying to google it and there seems to be a lot of semi-informed lore, some people say it's centralized and some people say parts are p2p.


Fortnite is definitely not p2p (my guess); can't speak for VRChat, but it makes sense that parts could be p2p. Lots of users, and they would want to minimize server traffic.

People want client-side checks for a variety of reasons. It's more performant (for your server), it's more robust (imagine you don't send player data for players behind a wall: that player might become visible very quickly by moving, but your server's response might be too late), and some stuff is almost impossible server-side (aimbots, etc.; sure, your server can make some guesses, but a client detecting the existence of such programs is still a plus).


VRChat is jam-packed with malicious users who use modded clients or run Wireshark to dox and harass people using their IP. Imagine an adult telling a little kid in VR, "I know where you live, it's in X city" and then social-engineering their way to even worse places. They did a big refactor a few years ago to remove all/most P2P networking due to the DDoS and harassment problems. The VRChat community is incredibly toxic.


VRChat exposes user IPs? edit: another comment mentions "partial p2p" (I assume to share user models etc? maybe more) which sounds plausible


You can still use Wireshark even with anti-cheat.


What little kid is going to own a multi-hundred-dollar VR system?


The same kids whose parents bought them an XBox or PlayStation. The Quest 2 is cheaper than both at $300. It’s standalone so you don’t need a PC. There’s been a sea change in the VR player base since the first Quest came out:

https://youtu.be/wbB807o-Fjs


A Quest 2 is $300 and so so many kids have them.


Same little kid that can have a computer, a tablet, or a console. Or a house, while we're at it. It's their parents that buy/own the VR systems; the kid's salary does not matter.


Here's a simple reason for why client-side cheat detection is necessary:

Suppose we have a competitive game where the goal is to click on a target quickly. Most online shooters have this property.

A cheat can always do this at least as fast as the best players in the world can.

Without client-side cheat detection, you will not be able to tell, on the server side, whether the commands sent from the client are coming from a cheat or from a world-champion player.

Yes, client-side detection isn't perfect. But in security, perfect is unattainable.
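To make the limitation concrete, here's a hedged sketch of the kind of purely server-side heuristic that *is* possible: flag accounts whose time-to-hit after a target appears is consistently superhuman. The threshold is an assumption (elite human reaction times cluster very roughly in the 150-200 ms range), and a cheat that deliberately paces itself like a top player slips right through, which is the parent's point.

```python
import statistics

HUMAN_FLOOR_MS = 150  # assumed floor; not taken from any real anti-cheat

def looks_like_aimbot(reaction_times_ms, floor=HUMAN_FLOOR_MS):
    """Flag a sample of reaction times as suspicious if the median is
    superhuman or the consistency is inhuman. Crude by design: it can
    only catch cheats that don't bother imitating human variance."""
    if len(reaction_times_ms) < 20:
        return False  # not enough evidence to judge
    median = statistics.median(reaction_times_ms)
    spread = statistics.pstdev(reaction_times_ms)
    return median < floor or spread < 5.0
```

Anything subtler than this needs the kind of statistical modeling that produces false positives, which is exactly why vendors keep reaching for client-side detection instead.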


Some good memes about this on the VRChat subreddit, which was locked to new submissions fifteen minutes ago in response to the overwhelming community outrage:

https://www.reddit.com/r/VRchat/


Guess what will happen? Hackers will circumvent EAC, and modders will go extinct. How to kill your game, any% speedrun.


This is why we need free (as in freedom) software. VRChat barely even seems like a game, so I'm surprised there's not a more popular libre version by now. I recall there are one or two options out there (can't recall the names), but I have yet to get into VR myself, largely because so much stuff for it is proprietary.


Aw. This is a real shame. Back when I played I liked dressing up in a little magician outfit and flying around / teleporting (the netcode takes the absolute position of the player from the client).

At the time the client was Unity + Mono (haven't checked if it still is) so you could just add a DLL with your own code that gets loaded, as well as ahead-of-time patching Assembly-CSharp to introduce your own hooks. Super simple & really mod friendly.


what are the implications of this for users? are there ubiquitous QOL mods that a lot of people run on VR chat?


No one will admit it, because of the company's insane attitude towards modding and the heavy handed bans that result from it, plus a policy of preventing actual discussion about said bans, but yes, that is very much the case.


Let's see how long it takes them to add microtransactions/NFTs now that they locked it down. My guess is <1 year.


One of their competitors, Neos, already has some kind of built-in cryptocurrency:

https://neos.com/#neos-credits

The selling point of Neos used to be that it allowed 10-point full-body tracking (as opposed to 6-point) while VRChat did not, but VRChat recently added that functionality.


I mean, yes, but it's not the only selling point of Neos. Among the differences you have stuff like:

- real time world creation and collaboration while being in game (no external unity editor) including a node scripting system

- user-hosted sessions (instead of full server hosting), at the user's cost

- an inventory system from which you can save/spawn in any world (this actually has a size limit, 1 GB for the free account)

- mods are much more accepted (maybe only for now, since that could change like we're seeing with VRC), and more that I forgot.

Note: I'm a regular Neos user; I'm not trying to sell the game. There is stuff that's annoying (like the game not being as optimized as VRC).

And on the subject of crypto, the community is actually really divided about it, while it's not really useful for now...


Another selling point of Neos is its ability to create full-featured games and worlds from within VR, as opposed to needing to code them in Unity to do anything custom for VRChat (and that's getting banned, from what I understand?)


I thought VRChat was all about hacking things together and tinkering with VR, from my impression of the people I've met in there.

They're going to lose a big portion of their clientele


That was five years ago. Now it's about screaming kids, adults grooming children, and sitting in front of mirrors in hypersexualized avatars.


party on....


No, you're confusing VRChat with Neos VR.


VRChat mods include things like general QOL improvements, but also most accessibility features are mods. Things like preventing flashing colors to avoid seizures.

They want to ban them because they compete with VRC+, their premium service.


At least one mute person was banned from their Discord for saying they need to be able to use muteTTS to talk to other players, which sends... quite the hostile message to players with disabilities. VRChat has practically no accessibility features beyond their fairly basic rank-based privacy settings (and of course there's a mod to enhance those considerably).


Well, that jumped the shark quickly


They consider archiving avatars to be malicious? Wonderful.


Tangentially to EAC, I've commented downthread several times on the danger to children and lax moderation of VRChat, but just to drive the point home, search "VRCat" on PornHub and you'll find all sorts of videos created in-game. These are avatars on VRChat's servers that users can and sometimes do use in public worlds. Officially VRChat's Terms of Service prohibits NSFW content. In practice, the game's loading screen tips say "no NSFW content in public worlds" and staff know exactly what's going on.

There is no NSFW tagging system, no content filtering for uploaded avatars, no effective means to permanently ban abusive users, and extremely lax in-game moderation. There are tons of sexualized, fully nude, and fully-functional avatars all over the place. Spend a couple hours in the main public worlds and you're almost guaranteed to see some. Remember, this game is absolutely full of children and pre-teens, and the kind of adults who choose to hang out around them. The game is free on the Meta Quest store with zero age restrictions.


In my experience, and from what I asked a few friends about: we have never seen an actual "NSFW" avatar in our combined hundreds of hours of playtime, and those that use avatars that come close to being NSFW are usually the younger audience. It is possible to filter out avatars in the world space and only see your friends' avatars. By making your own instances (servers), you are able to kick out players that misbehave.

But I would like a NSFW-tag on content.

And as a side-note: The game is free everywhere, Steam, too.


As someone who makes quite a few avatars for myself and friends, there is a flag for NSFW when publishing your avatars to VRChat's servers as well as ones for realistic violence, gore and such. I'm unaware of what impact these flags have for the use of said avatars though.

I have tested a fully SFW avatar with the NSFW flag checked in the past, and I was able to use it just fine inside private and public lobbies. So what we actually need here is a way for world creators and/or lobby visibility settings to restrict which avatar flags are allowed within the current session of that lobby, as well as the ability for users to auto-hide/show avatars with specific flags checked.

This won't be bulletproof, as it relies heavily on user-based trust (which VRChat has a long history of being against), and someone can very easily reverse what I did and mark a heavily NSFW avatar as SFW to bypass such filters, but it would be a good start for limiting exposure to these avatars.
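The per-lobby gating being proposed could be as simple as a set intersection. Everything here is hypothetical (the function name, the flag names, the idea of a per-lobby disallow list), since VRChat exposes no such API:

```python
def avatar_allowed(avatar_flags, lobby_disallowed):
    """Hypothetical per-lobby gate: an avatar is allowed only if it
    carries none of the flags the lobby has disallowed. This trusts
    uploaders to tag honestly, so it limits exposure rather than
    guaranteeing anything."""
    return not (set(avatar_flags) & set(lobby_disallowed))
```

A stricter variant could invert the logic into an allow-list of flags, which degrades more safely when uploaders mislabel content.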


I am just a regular (free-)user in VRChat, and have never seen a "NSFW"-tag or any kind of rating on avatars, so I thought there were no such thing. I wonder how it is used then.


I would contest that VRChat is just a VR version of Omegle in this sense. If a parent lets their young children onto a VR headset unsupervised, well, they probably would also let them onto Omegle unsupervised. In both cases I'm inclined to think the parent is somewhat negligent in letting their kids talk to random strangers on the internet.

To go even further, it seems the same issues occur on Twitter, Instagram, and TikTok, and those sites don't have to install kernel level anti-cheat to ensure you don't look at illicit content. At the same time, I would argue that this shows that kid-unfriendliness is an internet-wide problem and maybe we should be doing more in general to build an internet that works for kids.


Is that really related to modding though? I thought model swapping was built into the game; people can change their look without any mods. How does banning mods solve that?


It doesn’t. I’m talking about a problem that’s inherent to the base game, and unfortunately not widely known.


This sort of just sounds like the internet at large.


Yes, to an extent it’s more of the same, just more immersive. If you’re an adult, the bad stuff is white noise for the most part.

If you’re a kid, it can be scary and potentially dangerous. The game is advertised with cartoon characters and a child-friendly aesthetic, far from what you actually find in public worlds. It’s easy for bad actors to home in on kids based on voice, then corner them and do bad things. I’ve spoken to kids who were approached and “flashed” by much older users with explicit, ejaculating sexual avatars. It’s unpleasant and disorienting. Not to mention adults who zero in on kids, drop a portal to show them a “cool world” and then get them one-on-one in essentially an unsupervised VoIP call. This platform is practically designed to trap and abuse kids.

And yes, kids technically aren’t allowed to play VRChat or even go online using the Quest per Terms of Service. Enforcement is impossible, not that anyone tries, so kids are everywhere.


If kids are banned from the service then people shouldn't bother trying to protect them on the service. That's the responsibility of their parents not the owners of the service.


When the service is full of kids anyway and the operator knows this, the situation changes. Also many adults (me, other users I’ve spoken with in-game) would like the option to filter NSFW content, as on practically every other large Internet platform.


Bible Black and South Park are technically "cartoon characters", that doesn't mean acceptable for kids.

Please don't let your child talk to strangers - in person, in Roblox or VRChat, in sketchy forums, etc. If you do, set boundaries for them and have the talk about danger.

If you let your kid talk to strangers, strangers will talk to them.


The community's reaction has been documented in a video here, and it's not good: https://www.youtube.com/watch?v=L6g11f0FGPo


I hope Steam Proton support will still be available. I have been running it with Proton and, besides videos not playing (a known issue), everything else is good.


Correction: it seems they acknowledged the Steam Proton scenario, and they claim EAC works fine.


It's trivial to bypass EAC though..


It’s trivial to bypass; not sure if it’s that trivial to evade getting caught, though.


Getting caught just closes the game. They aren't banning modders.


As with all DRM, it will pose no real obstacle to abusers while inconveniencing regular users (in this case, people using quality-of-life mods).


Yeah, it's still fine with loading a modified kernel32.dll as long as it is EV-signed, for example.


It's relatively simple to bypass EAC when you're just a single user, since it just ignores all of the blatant signals you're giving off; that approach just doesn't scale well.


Eventually those signals can accumulate to some threshold and they will ban single users. They do improve their anti-cheat pretty regularly, and a cheat that only produced 1 or 2 suspicious signals (e.g. RWX memory in kernel space that isn't backed by some valid module) might eventually produce 3 or 4 as they improve their anti-cheat (maybe they start looking in some obscure Windows table to find evidence that you loaded a bad driver in the past, or they stackwalk your kernel thread to find it executing in unbacked memory).


> (e.g. RWX memory in kernel space that isn't backed by some valid module)

They do flag you for this, but it's not actionable on its own, as PatchGuard demonstrates the exact same kind of behavior: RWX memory in kernel space that's not associated with a signed module.

As you also surely know, there is also a lot of RWX memory allocated by many drivers that is not actually utilized, into which you can deploy your shellcode.
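The "unbacked RWX memory" signal being discussed can be illustrated in user space. As a rough Linux analogue (EAC's actual checks are kernel-side on Windows, and `find_unbacked_rwx` is just a name used here for illustration), you can scan `/proc/self/maps` for mappings that are readable, writable, and executable but have no backing file:

```python
# Hypothetical user-space sketch of the "unbacked RWX memory" signal:
# on Linux, anonymous rwx mappings (no file path) are a common shellcode
# indicator. This is NOT how EAC works; it only illustrates the idea.

def find_unbacked_rwx(maps_text):
    """Return (address-range, path) pairs for rwx mappings with no backing file."""
    suspicious = []
    for line in maps_text.splitlines():
        fields = line.split(maxsplit=5)
        if len(fields) < 2:
            continue
        addr_range, perms = fields[0], fields[1]
        path = fields[5] if len(fields) == 6 else ""
        # Permission field looks like "rwxp"; a missing leading "/" in the
        # path means the mapping is not backed by a file on disk.
        if perms.startswith("rwx") and not path.startswith("/"):
            suspicious.append((addr_range, path))
    return suspicious

with open("/proc/self/maps") as f:
    print(find_unbacked_rwx(f.read()))
```

A real anti-cheat would combine a signal like this with many others (module lists, thread start addresses, driver history) before acting, as the comments above describe.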


> They do flag you for this, but it's not actionable on its own, as PatchGuard demonstrates the exact same kind of behavior: RWX memory in kernel space that's not associated with a signed module.

I'm sure most cheats use drivers that are pretty similar to each other (e.g. they will have the same imports, which show up as plaintext, I think), so they can look for that. And PG threads/pages have plenty of identifiers.

> As you also surely know, there is also a lot of RWX memory allocated by many drivers that is not actually utilized, into which you can deploy your shellcode.

Most cheats will use this shellcode to still jump back into the larger suspicious RWX memory region. I guess discardable sections might be large enough to hold an entire driver; I haven't really been following the latest developments in this field.


Pretty much all EAC games that are worthy targets have paid hacks/cheats/botting suites that are widely used and undetected. I would guess it depends on the game's vendor; most don't care enough to actually improve the detections and just use EAC as a "we tried" excuse.


EV certs cost hundreds of dollars, though...


You can add your own "EV" certs to the Windows certificate store very easily.


Most of these check the certificate chain, and I've seen a few also upload fingerprints of certs that are rare or not seen before.


The VRchat experience has become horrifying, to say the least. Glad they are working on it.


It seems like the problems they say they are solving should have been solved server side rather than in the client.


the East Australian Current? Gimme some shell bro!


Exact Audio Copy?


Easy Anti-Cheat, https://www.easy.ac



