this post was submitted on 06 Dec 2023
55 points (100.0% liked)

Gaming


So yeah, I want to discuss why I think Valve needs to fix its anti-cheat. They have VAC, but apparently it's doing jack shit: whether it's Counter-Strike 2 (or any previous iteration) or something like Hunt: Showdown, the prevalence of cheating players is undeniable. Personally, it has reached the point where I no longer enjoy playing those games, even though they're great games in themselves. The number of times I get killed by or matched against cheaters is so high that I don't see the point anymore.

  • Why do I think Valve is the only company able to do something against cheaters?

Because they already have the tools: VAC is meant to prevent cheating. Valve has the resources to invest in something more profound, something that could be used by any game that needs anti-cheat protection. And lastly, Valve is the company most interested in furthering gaming on Linux, and an anti-cheat solution needs to work on both operating systems. Only Valve has the motivation and the means to achieve that with their knowledge and resources. What do you think about the topic? Is the fight against cheaters hopeless? Do you think some other entity should provide anti-cheat protection, and why? I skimmed some "anti-cheat in the Linux kernel" posts on the net, but I have very little knowledge about the topic. What is your stance on it?

Edit: I mixed up EAC and VAC. EAC is apparently an Epic product. Both of these tools seem unable to prevent the cheating mentioned above.

[–] savvywolf@pawb.social 101 points 1 year ago (5 children)

Screw client side anti-cheat, fix your goddamn server code.

I'm reminded of a case in Apex Legends where cheaters started dual wielding pistols, despite dual wielding not actually being a game mechanic. That should be something you can easily detect on your server and block.
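A server-side check for that kind of impossible state can be trivial. A minimal sketch in Python, with a hypothetical loadout message and an invented rule (this is not Apex's actual code):

```python
ALLOWED_EQUIPPED_COUNT = 1  # assume the game only allows one weapon in hand

def validate_equipped(equipped_weapons: list[str]) -> bool:
    """Return True if the client-reported equipped state is possible under
    the game rules; dual wielding isn't a mechanic, so two weapons equipped
    at once is an impossible state."""
    return len(equipped_weapons) <= ALLOWED_EQUIPPED_COUNT

# A client claiming to hold two pistols at once gets flagged or kicked.
print(validate_equipped(["p2020"]))          # True
print(validate_equipped(["p2020", "re45"]))  # False
```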

Client side anticheat is just smoke and mirrors and lets developers think they can get away with not doing their job of writing secure code.

I'm honestly surprised that, with all this concern about privacy against Google, Microsoft, Epic, and so on, gamers are willing to just let these games have unrestricted and unchecked access to all of their internet, microphone, and camera data.

Likewise, despite how much gamers call games "broken glitchy messes", they are perfectly willing to give them enough hardware access to literally destroy their computers.

[–] brsrklf@jlai.lu 37 points 1 year ago* (last edited 1 year ago) (1 children)

Yeah, I agree with that. Installing freaking rootkits on people's personal device, with the express purpose of identifying them and knowing what their machine contains, is not OK. A multiplayer client should be as lightweight as possible and shouldn't be able to fuck with a game.

Even if they agree not to use your data for anything else, the next security breach on their servers will make that promise useless.

And I'm not sure why one would trust big publishers to have any kind of ethics anyway. Do you remember Activision's patent for manipulating matchmaking, which would specifically match players so as to reward those who buy microtransactions and put pressure on those who don't?

Yeah, totally trusting those manipulative snakes with my private data with a big "do not watch" sticker on it.

[–] tal@lemmy.today 12 points 11 months ago (1 children)

Installing freaking rootkits on people's personal devices

If Valve is gonna do anything, I'd rather have them sandbox games from screwing with the environment, not the opposite. I'd like to be able to install random mods from Steam Workshop without worrying about whether some random modder might have malware attached to their mod that can compromise the whole system. I don't care if a malicious mod dicks up the save games for a particular game, but I'd rather know that it cannot go beyond that.

That doesn't solve the cheating problem, of course, but it's a case where anti-cheating efforts and security concerns are kind of at odds.

[–] lemmyvore@feddit.nl 26 points 11 months ago* (last edited 11 months ago) (1 children)

Hear, hear.

Quick disclaimer: I've been involved with FOSS shooters for something like 20 years now. I mention that to establish where I come from: in a FOSS game anybody can modify the game client all they want, so all the bullshit is out of the way from the start. You can't hide behind make-believe notions such as "they can't modify the client" – which is one of the major lies and fallacies of commercial closed-source games. If there's something you don't want the client to know or do, you make it so on the server.

There are a lot of things the server can do to severely limit cheater shenanigans. If you don't want them to see through walls, then don't tell them what's behind walls. If you don't want them to know what's behind them, then don't tell them what's outside their cone of view. If you don't want them to teleport, look at where they were a moment ago and where they claim to be now and figure out whether that should be possible. You get the idea.
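To make the last example concrete, here's a minimal sketch of that kind of server-side plausibility check in Python. The speed, tick rate, and slack factor are all made up; a real server would use its own movement model.

```python
import math

MAX_SPEED = 7.5        # assumed maximum legitimate speed, units per second
TICK_SECONDS = 1 / 60  # assumed server tick length

def plausible_move(last_pos, claimed_pos, ticks_elapsed=1, slack=1.2):
    """Reject position updates that would require moving faster than the
    game allows, with a little slack for lag compensation."""
    dx, dy, dz = (claimed_pos[i] - last_pos[i] for i in range(3))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    budget = MAX_SPEED * TICK_SECONDS * ticks_elapsed * slack
    return distance <= budget

# A small step is accepted; a cross-map "teleport" in one tick is not.
print(plausible_move((0, 0, 0), (0.1, 0, 0)))  # True
print(plausible_move((0, 0, 0), (50, 0, 0)))   # False -> snap back or flag
```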

Aimbots can be detected because at the core it's a simple issue of the client's aim snapping from one place to the target too fast. What's "too fast" and the pattern of the movement can be up for debate but it can definitely be detected and analysed and reviewed in many ways – regular code, AI, and human replay.

If this kind of analysis is too much for your server to perform in real time (it was too much, 20 years ago) then you can store it and analyse it offline or replay it for human reviewers. You can fast-parse game data for telltale signs, analyse specific episodes in detail, and decide to ban players. Yes it happens after the game was ruined but at least it happens.
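As a rough illustration of the "aim snapping onto the target too fast" idea, here's a toy offline check over recorded view angles. The threshold and data format are invented; as said above, what counts as "too fast" is up for debate, and flagged cases should go to human review.

```python
def suspicious_snaps(view_angles, hit_ticks, max_deg_per_tick=40.0):
    """view_angles: list of (yaw, pitch) per tick, in degrees.
    hit_ticks: set of tick indices where the player landed a shot.
    Flag ticks where the crosshair jumped an implausible angular distance
    and immediately scored a hit."""
    flagged = []
    for t in range(1, len(view_angles)):
        dyaw = abs(view_angles[t][0] - view_angles[t - 1][0]) % 360
        dyaw = min(dyaw, 360 - dyaw)          # shortest way around the circle
        dpitch = abs(view_angles[t][1] - view_angles[t - 1][1])
        if max(dyaw, dpitch) > max_deg_per_tick and t in hit_ticks:
            flagged.append(t)
    return flagged

# Slow tracking for three ticks, then a 120-degree snap onto a kill at tick 3.
angles = [(10, 0), (12, 0), (13, 1), (133, 1)]
print(suspicious_snaps(angles, hit_ticks={3}))  # [3]
```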

There are a couple of types of cheating that you can't detect server side:

  • Modifications to the client HUD that help the player grok information faster and better. This is a large category that can include things like colorblindness overlays, font changes, UI changes, movement tracking on display etc. As far as I'm concerned that falls under HUD modding and should be welcome in any healthy game. Again, if you don't want clients to have a piece of information don't give it to them, and design your game in a way that such mods are mostly irrelevant.
  • Automating input. Again a large category that includes macros that speed up complex chains of operations. Can be slowed down by imposing server-side delays but you can hurt legit fast players this way too. Same as above, if this is what makes or breaks your shooter then perhaps you should rethink it.

Some of the most fun games I've seen did not care about HUD mods and macros and, on the contrary, embraced them. You want to write a macro that will auto-purchase the best gear based on your available coin after respawn? Knock yourself out, because what constitutes "best" gear changes depending on the circumstances, and a veteran with a pistol can smoke your ass anyway if you don't know how to properly use that fancy plasma gun.

I've mentioned human review above which brings up an interesting feature that I don't see implemented in enough games: saving and replaying game metadata. It's stupidly simple to store everything that happened during a match on the server side and it doesn't take much space. You can offer that recording to seasoned players to replay on their PC which allows them to see the match from any player's point of view. An experienced veteran can notice all kinds of shenanigans this way – and it's also an excellent e-sport and machinima feature that enables commentary, editing, tutorials and so on.
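Storing that kind of recording really is cheap. As a minimal sketch (the event format is entirely made up), an append-only log of per-tick events is all it takes:

```python
import json

class MatchRecorder:
    """Append-only log of per-tick game events, small enough to keep for
    every match and replay later from any player's point of view."""

    def __init__(self):
        self.events = []

    def record(self, tick, player_id, kind, data):
        self.events.append({"tick": tick, "player": player_id,
                            "kind": kind, "data": data})

    def save(self, path):
        with open(path, "w") as f:
            json.dump(self.events, f)

rec = MatchRecorder()
rec.record(120, "player_7", "pos", [412.0, 96.5, 12.0])
rec.record(121, "player_7", "fire", {"weapon": "rifle", "target": "player_3"})
rec.save("match_0001.json")  # reviewers or an offline analyser read this back
```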

Edit: Oh, forgot one thing. You may be wondering, then why don't the big game companies do all this? Simple, cost. Why should they pay for server juice and staff to review games properly when they can slap a rootkit on your computer and use your resources?

[–] tal@lemmy.today 4 points 11 months ago* (last edited 11 months ago) (1 children)

in a FOSS game anybody can modify the game client all they want, so all the bullshit is out of the way from the start. You can't hide behind make-believe notions such as "they can't modify the client" – which is one of the major lies and fallacies of commercial close-source games.

Sometimes, just for practical performance reasons, with realtime games, the client is gonna need access to data that would permit one to cheat. You can't do some game genres very well while keeping things on the server.

Consoles solve this by not letting you modify your computer. I think that if someone is set on playing a competitive game, that's probably the best route, as unenthusiastic as I am about closed systems. The console is just better-aimed at providing a level playing field. Same hardware, same performance, same input devices, can't modify the environment.

'Course, with single player games, all that goes out the window. If I want to modify the game however I want, I should be able to do so, as it doesn't hurt anyone else. I should be able to have macros or run an FPS in wireframe mode or whatever.

For PC competitive multiplayer, in theory, you could have some kind of trusted component for PCs (a "gaming card" or something) that has some memory and compute capability and stores the stuff that the host can't see. The host could put information that the untrusted code running on the host can't see on the card. It also lets anti-cheat code run on the card in a trusted environment with high-bandwidth and low-latency access to the host, so you can get, for example, mouse motion data at the host sampling rate for analysis. That'd be a partial solution.
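To make the idea a bit more concrete, here's a toy model of that split, with the hypothetical "gaming card" standing in as a trusted object that holds enemy positions and only hands the host what falls inside the view cone. No such hardware exists; this is purely illustrative.

```python
import math

class TrustedCard:
    """Toy stand-in for the hypothetical trusted component: it holds state
    the untrusted host must not see and only releases what is in view."""

    def __init__(self, enemy_positions):
        self._enemies = enemy_positions  # hidden from the host process

    def visible_enemies(self, player_pos, facing_deg, fov_deg=90.0):
        out = []
        for pos in self._enemies:
            angle = math.degrees(math.atan2(pos[1] - player_pos[1],
                                            pos[0] - player_pos[0]))
            diff = abs((angle - facing_deg + 180) % 360 - 180)
            if diff <= fov_deg / 2:
                out.append(pos)  # only now does the host learn this position
        return out

card = TrustedCard([(10, 0), (-10, 0)])
print(card.visible_enemies((0, 0), facing_deg=0))  # [(10, 0)], the one ahead
```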

[–] justJanne@startrek.website 3 points 11 months ago (1 children)

Sure, it'd be a solution for five minutes until someone delids the secure enclave on the gaming card, extracts the keys, and builds their own open source hw alternative.

High-performance FPGAs are actually relatively cheap if you take apart broken elgato/bmd capture cards, just a pain in the butt to reball and solder them. But possibly the cheapest way to be able to emulate any chip you could want.

[–] tal@lemmy.today 3 points 11 months ago* (last edited 11 months ago)

someone delids the secure enclave on the gaming card, extracts the keys

Not a problem. You can potentially go for an attack on the hardware and maybe recover a key, but each card has a unique key tied to it. Now the attacker has the key for a single trusted device. He can't distribute it with an open-source FPGA design and have other users use that key, or it'll be obvious to the server that many users have the same key, and they blacklist it.

It's because hardware is a pain to attack that consoles don't have the cheating issues that PCs do.

[–] skullgiver@popplesburger.hilciferous.nl 13 points 1 year ago* (last edited 11 months ago) (4 children)

[This comment has been deleted by an automated system]

[–] conciselyverbose@kbin.social 27 points 1 year ago (6 children)

Server side anti cheat can’t distinguish good players from aimbots.

Neither can a rootkit, which should be unconditionally illegal and send CEOs to jail for putting in their product. There are no exceptions and no scenarios where it can possibly be acceptable for a video game to access any operating system anywhere near that level. Every individual case should constitute felony hacking, with no possibility of "user consent" being a defense even if they do actually clearly and explicitly ask for "permission".

[–] savvywolf@pawb.social 11 points 1 year ago (2 children)

Server side anti cheat can't distinguish good players from aimbots.

I've been thinking about this, and I wonder how accurate this is. I think overuse of all this modern AI nonsense is a problem, but wonder if this might be a good use case for it.

A big game will probably have huge amounts of training data for both cheaters and non cheaters. An AI could probably pick up on small things like favouring the exact centre of the head or tracking through walls.

If a user has a few reports of aimbotting, just have this AI follow them for a bit and make a judgement.
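Something along these lines is easy to prototype. A hedged sketch using scikit-learn, with two made-up per-player features (average distance of hits from the centre of the head, and fraction of time the crosshair tracks an enemy through a wall) and toy labels; real feature engineering and training data would be far richer:

```python
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-player features aggregated from recorded matches:
# [avg distance of hits from head centre, fraction of time the crosshair
#  tracked an enemy through a wall]. Labels: 1 = confirmed cheater, 0 = clean.
X_train = [[0.02, 0.60], [0.01, 0.75], [0.18, 0.05], [0.25, 0.02]]
y_train = [1, 1, 0, 0]

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# A reported player's recent matches get scored; a human reviews before any ban.
reported_player = [[0.03, 0.55]]
print(clf.predict_proba(reported_player)[0][1])  # estimated probability of "cheater"
```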

It'll get it wrong sometimes, but that's why you also implement a whole appeals process with actual humans. Besides, client side anticheat systems also have a nasty habit of mistakenly banning people for having specific hardware/software configs.

However, I would like games to come with servers again so you can play games on your own terms

Please! Not just for anticheat reasons, but also for mods and keeping the game playable when the publishers decide it isn't profitable.

[–] jaykay@lemmy.zip 3 points 1 year ago (1 children)

In the case of CS2, it doesn't even ban people who teleport behind you in the first second of the round, or people killing everyone across the whole map like here (Reddit): link

[–] Omega_Haxors@lemmy.ml 3 points 1 year ago* (last edited 1 year ago) (1 children)

I made an anti-cheat for vanilla Minecraft once. It's REALLY easy to tell if someone is cheating; it's just that developers are grotesquely incompetent when it comes to detecting that sort of thing, or (more often) just don't give a shit. They'll create a naïve solution and then never test it. For example: Minecraft's god-awful anti-fly and anti-speedhack, which amounts to "has the player been in the air for 5 seconds" or "did the player go too fast", is notorious for false positives and doesn't even stop people trying to cheat; it just punishes players for its own fuck-ups.

It really is as simple as creating a model of what the player should be able to do, and then nudging clients towards that expected play. Normal players will not even notice (or will be pleased when it fixes a desync), but cheaters will get ENRAGED and try to cheat harder before eventually giving up. The point of a good anti-cheat is not to punish players for cheating, but to make it easier and more fun to play within the rules.
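A minimal sketch of that "nudge back towards the model" idea for movement, in one dimension for brevity and with an invented tolerance; the server simply re-sends an authoritative position instead of punishing anyone:

```python
def reconcile(server_pos, client_pos, max_error=0.5):
    """If the client's claimed position drifts beyond what the movement model
    allows, quietly snap it back instead of punishing the player outright."""
    if abs(client_pos - server_pos) <= max_error:
        return client_pos  # within tolerance: accept (this also smooths desync)
    return server_pos      # out of bounds: correct the client

# An honest client with a little lag is accepted; a speedhacker just gets
# pulled back every tick, which makes the cheat useless rather than ban-worthy.
print(reconcile(10.0, 10.3))  # 10.3
print(reconcile(10.0, 25.0))  # 10.0
```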

It's like piracy: we had years of systems built on punishment, and all they did was create resentment and people trying to break them; but build a system on rehabilitation and you become one of the biggest platforms for PC gaming, with people willingly downloading it.

[–] dino@discuss.tchncs.de 3 points 1 year ago (2 children)

How do you propose to hinder aimbots and the like from working with server-side changes?

[–] Cirk2@programming.dev 5 points 1 year ago

How do you stop it on the client side? I'm not sure if it has been deployed in the wild, but these days computer vision is good enough to work straight off the rendered images: capture the image signal, then have a fake USB mouse output movements calculated from the image data. If this isn't already available, it's only held back by the need for extra hardware.

[–] savvywolf@pawb.social 3 points 1 year ago (16 children)

I described a plan here: https://pawb.social/comment/4536772

Not perfect, but neither are rootkits.

[–] t3rmit3@beehaw.org 33 points 1 year ago (6 children)

I have run into maybe 3 people that I legitimately think were cheating, in 6+ years of CS:GO, and now CS2.

Where the hell are you running into this many cheaters?

[–] dreadgoat@kbin.social 10 points 11 months ago

On the flipside of this, I've been kicked from games because I know how to prefire, and a lot of players see that and just assume you're wallhacking. Nobody pays attention to the 70% of the time that you prefire at air, but when you guess right and instakill someone holding an angle, it's easier to say "cheater" than "I've been holding this same angle for the past 5 rounds, perhaps I've become predictable".

[–] jjagaimo@lemmy.ca 3 points 1 year ago (2 children)

It's basically luck of the draw with trust factor and region

I regularly run into cheaters: I watch the demo afterwards and they just sit there aim-locked onto someone, tracking them through the wall for 10 seconds before blasting them without ever seeing them, or they react to things they can't see (e.g. suddenly flick to a corner someone is walking up to, in a panic, without seeing or hearing a thing). Basically every other game has someone suspicious, if not blatantly cheating from the start. It was bad in CS:GO and it's 10x worse in CS2.

[–] t3rmit3@beehaw.org 4 points 1 year ago (1 children)

I remember back in like 2016~2017 seeing one of those spinning aimbots with a wallhack, just sitting at CT spawn in Dust 2 and killing everyone on T. We all watched it for 5 minutes until it got VAC-banned. That one was hilarious.

I do wonder if West Coast US (where I am) is more heavily policed than other regions. That would make sense if Valve is doing some kind of post-match automated analysis of player behavior, which would probably be too compute-intensive to run everywhere.

[–] jjagaimo@lemmy.ca 3 points 11 months ago

Apparently the East Coast is just a FFA. I've played on EU and West Coast servers with friends and they're definitely better about cheaters.

[–] Emphimisey@aussie.zone 3 points 11 months ago (1 children)

You are either using hyperbole or you are lying. VAC is an incredibly good AC for CS. Having a cheater every other game is not possible unless your trust factor is in the basement, you are at 20k+ (which I doubt), or you are really low, like <3k (most likely).

Game sense is a big thing in CS, and it can be the reason for a lot of decisions people make that can be mistaken for cheating. Go watch professional LAN tournaments of 1.6, especially on Nuke, and see all the wallbangs that happen. That's not wallhacks, it's game sense.

[–] MentalEdge@sopuli.xyz 29 points 11 months ago* (last edited 11 months ago) (6 children)

Cheats will only grow more advanced; at some point you'll be able to train an AI to play exactly like a human, but performing perfectly, far more reliably than a human can.

The line between what skill looks like versus cheating will only get blurrier.

The real long-term solution is to enable the vetting of players (not by the game company or, god forbid, the government; looking at you, China) by returning to community-based servers and private matches, and to have reports dealt with faster, by people who care about the game personally.

As a member of the Northstar community, cheating is basically a solved problem for us atm.

There is no anti-cheat; instead, a global ban tracking system was put in place. Server admins are able to share the identities of players who have been caught cheating, banning them on every server, regardless of who is running it, simply by hosts opting into the global ban system.
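Mechanically, that kind of opt-in shared list is simple; a minimal sketch below, assuming bans are keyed by a platform account ID (Northstar uses the EA account ID, as mentioned further down) and published at a hypothetical URL:

```python
import json
import urllib.request

SHARED_BANLIST_URL = "https://example.org/banlist.json"  # hypothetical endpoint

def fetch_shared_bans(url=SHARED_BANLIST_URL):
    """Servers that opt in periodically pull the community-maintained list
    of banned account IDs (a JSON array of strings, in this sketch)."""
    with urllib.request.urlopen(url) as resp:
        return set(json.load(resp))

def allow_join(account_id, local_bans, shared_bans):
    """A joining player is rejected if they appear on either list."""
    return account_id not in local_bans and account_id not in shared_bans

# shared = fetch_shared_bans()
# print(allow_join("ea_1234567", local_bans=set(), shared_bans=shared))
```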

People used to form "gaming-clans" in order to find people to play games with to begin with, and that structure for a community around a game is likely to become relevant again simply to be able to fill matches with people who you can be sure are honest players.

[–] gk99@beehaw.org 7 points 11 months ago* (last edited 11 months ago) (1 children)

People used to form "gaming-clans" in order to find people to play games with to begin with, and that structure for a community around a game is likely to become relevant again simply to be able to fill matches with people who you can be sure are honest players.

Unlikely imo, because modern game devs have been killing the viability of that for years. User-hosted servers are gone, crossplay is reliant on SBMM to be realistically possible, and private matches often block players from receiving XP and rewards because publishers are worried about FOMO and people having too much fun without spending enough. Even CS:GO got an update in the months leading up to CS2 that removed the ability to earn drops on community servers, driving another nail into the coffin of one of the last of these games that still retained the mere ability to run servers of our own.

[–] MentalEdge@sopuli.xyz 8 points 11 months ago* (last edited 11 months ago)

While that's all true, the day you can just fire up an undetectable AI to play for you, and all the matchmaking queues are flooded with people doing the same... Players are going to beg for the ability to not just team up with people they know, but play against people they know.

Maybe that won't be privately hosted servers, or even fully custom matches, but when cheaters become indistinguishable from the highly skilled, forming even the most basic community bonds in order to find people to play with will be preferable to matching with randos.

For similar reasons people already prefer to team up with someone they know, as opposed to a stranger they might have to carry. People will want to be able to pick who they go up against, as well.

Once the cheaters win (and they will), the first game to figure out a system that lets players do this WILL be a better experience than current matchmaking algos.

Edit: An example of a game that kind of already does this is Elite: Dangerous. There are two main modes, open and solo. In open you can run into all other players also playing in open, which means you might have to defend yourself against other players.

But if you want to avoid PvP while still running into other players, you're in luck, because there is a third option: private groups. When in a private group, the game works as if you're in open, but you can only see other players who are in the same group, meaning other players who also do not want to engage in PvP.

Mobius is likely the largest such group; essentially it's a giant clan of non-PvPers who play the game together. Something similar could absolutely be done for other games, where smaller communities can vet their members and get rid of players who break the rules.

[–] dino@discuss.tchncs.de 3 points 11 months ago (2 children)

server admins are now able to share the identities of players who have been caught cheating, banning them on every server, regardless of who is running them, by the hosts simply opting into the global ban system.

Based on what information? I have no clue what Northstar is, but if you ban by IP or MAC, it's pointless.

[–] xep@kbin.social 8 points 11 months ago

It's cat and mouse when it comes to banning; even with HWID signatures, cheaters are able to use sophisticated spoofing techniques. There are also side effects, like legitimate players buying second-hand PCs that have been banned.

[–] MentalEdge@sopuli.xyz 6 points 11 months ago* (last edited 11 months ago)

EA account ID. Northstar is the community modded version of Titanfall 2.

[–] fwygon@beehaw.org 20 points 11 months ago

Most anti-cheat software can't do much on the client side. Really, all it can do is look around at its environment, where it's allowed to look, and see what's going on.

Most cheat software will run at a higher privilege level than the game, whether that's as an "Administrator" user, or as "root" or "SYSTEM" in a context where it's running as an important driver.

In any case, the only thing the Anti-Cheat can reliably do on the client side is watch. If it's cleverly designed enough, it will simply log snippets of events and ship them off for later analysis on a server side system. This will probably be a different server than the one you're playing on, and it won't be sending that data until after the match has ended properly.

Sometimes it might not even send data unless the AC server asks it to do so, which it might frequently do as part of its authorization-granting routine. Even when the server has the data, there may not be immediate processing.

Others have also mentioned that visible action may be delayed for random periods of time, in order to prevent players from catching on to which behaviors get them caught, and to prevent cheats from getting more sophisticated before deeper analysis can reveal a way to patch the flaw or to verify that cheating isn't happening.

Since cheat software can often be privileged, it also has the luxury of lying to the server. So clever ways to catch a lying client will probably be implemented, with responses checked to ensure they fit within some reasonable bounds of sanity.
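That last point is essentially bounds-checking whatever the client reports. A minimal sketch with made-up telemetry fields and limits (nothing here reflects any real anti-cheat's checks):

```python
# Hypothetical sanity check on client-reported telemetry: a privileged cheat
# can lie, but the lies still have to stay inside plausible bounds.

SANITY_BOUNDS = {
    "fps": (5, 1000),             # reported frames per second
    "input_rate_hz": (0, 8000),   # reported mouse polling rate
    "loaded_modules": (1, 2000),  # count of reported loaded modules
}

def telemetry_is_sane(report: dict) -> bool:
    for field, (low, high) in SANITY_BOUNDS.items():
        value = report.get(field)
        if value is None or not (low <= value <= high):
            return False  # missing or out-of-range answer: flag for review
    return True

print(telemetry_is_sane({"fps": 240, "input_rate_hz": 1000, "loaded_modules": 180}))  # True
print(telemetry_is_sane({"fps": 240, "input_rate_hz": 1000}))                         # False
```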

[–] Megaman_EXE@beehaw.org 13 points 11 months ago* (last edited 11 months ago) (2 children)

Valve works differently than other companies. Internally, everyone works on what they want, when they want; you can literally wheel your desk to a new location if you decide you want to work with another team. Because of this, though, it creates an odd dynamic that isn't always going to work out best for the developers or the consumers.

This is why it feels extremely random whenever Valve releases something new. You would think they would just release banger after banger of hot AAA titles, but unfortunately it's more complicated than that. This is also reflected in things like fixing cheats in Team Fortress 2, etc.

This video will answer a lot of your questions

https://youtu.be/s9aCwCKgkLo?si=YoE9G-S80xf7JjLO

[–] DmMacniel@feddit.de 5 points 1 year ago* (last edited 1 year ago) (1 children)

Valve uses EAC? The fuck? What's with VAC?

Besides that, EAC works without issues on Linux.

[–] dino@discuss.tchncs.de 3 points 1 year ago* (last edited 1 year ago)

Oh did I mix up those two? I was actually referring to VAC, need to check who is responsible for EAC.

Edit: Apparently it's Epic; I will correct the original post.

[–] Kolanaki@yiffit.net 4 points 1 year ago* (last edited 1 year ago) (3 children)

My only issue with VAC and Valve's policy on dealing with cheaters isn't that their anti-cheat isn't good; it's that even when a player is flagged for using cheats, it doesn't instantly remove them. It waits, sometimes for months after flagging them, before it actually doles out punishment.

Their reasoning is that it slows down new cheats being made; but what the hell does that matter if the existing cheats it flags are still allowed to be used for months at a time?

That said, it's rare that I encounter cheaters in CS. There are plenty of other games I play with constant, obvious cheaters who clearly aren't being detected by the anti-cheats in use, since those ban instantly on detection.

[–] xep@kbin.social 9 points 11 months ago* (last edited 11 months ago)

Giving cheat authors instant feedback on whether they've been detected results in cheats getting better at evading detection more quickly.

This is standard practice in anti-cheat methodology, and is generally agreed upon to result in much more positive long-term outcomes.

[–] Akrenion@programming.dev 6 points 1 year ago (1 children)

This is for paid cheats. If you ban in waves, the companies selling cheats lose a larger sum of money all at once. It also hits right at the time when they need to put in the most work.

[–] jjagaimo@lemmy.ca 3 points 1 year ago

The problem is that, for detection of identical programs, VAC relies on program signatures. You could make slight changes to the program to change the signature and recompile it, or use something that changes the signature every time you compile. That means that even though the people running the cheats are using essentially the same program, sold to them by the same person, if one gets banned VAC looks at the other build and goes, "I've never seen this program in my life."
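In the simplest form, a "signature" is just a hash of the binary, which is why a trivial recompile defeats it. A toy sketch of that kind of check (this is not how VAC actually fingerprints programs, just an illustration of the weakness):

```python
import hashlib

# Hypothetical database of known cheat builds, keyed by file hash.
KNOWN_CHEAT_HASHES = {
    "9f2c1adeadbeef": "AimHelper build 1",  # placeholder hash, not a real cheat
}

def file_signature(path: str) -> str:
    """SHA-256 of the file contents; the crudest possible 'signature'."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def is_known_cheat(path: str) -> bool:
    return file_signature(path) in KNOWN_CHEAT_HASHES

# Changing a single byte (a string, a compiler flag, some padding) produces a
# completely different hash, so the same cheat sold to two people can look
# like two unrelated programs to a check like this.
```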

Other anti-cheats will try to identify programs by their functionality (e.g. modifying or reading the memory of other programs) and by using heuristics, but that is both more invasive and requires a higher level of privilege, which many people aren't willing to give.

The other alternative Valve is experimenting with is AI to detect aimbots, which could work in some instances, but it is prone to false positives and isn't able to identify behavior such as walling as easily.
