Team Fortress 2 source code has leaked (techradar.com)
309 points by adam_fallon_ on April 22, 2020 | 158 comments



What is up with the strange sensationalist claims in the article and on Twitter? Source code availability is not a prerequisite for people finding vulnerabilities or RCE exploits in games; there are many established games with open source game clients. Security researchers routinely reverse engineer proprietary software.

Bizarre.


Source code availability makes it a lot easier to find vulnerabilities. Open source code is much more likely to already have been audited better. Closed source code often depends more heavily on security by obscurity, and an unexpected source release can definitely make vulnerabilities immediately apparent that weren't previously known.


> Open source code is much more likely to already have been audited better.

Common wisdom. It just happens to not be true. People just aren't auditing random code on github for fun. Auditing code is hard and time-consuming. Most vulnerabilities are found by techniques like fuzzing, not by combing through thousands of lines of code.


I used to do it: every time I installed a new package/game/service, I'd look at the code. That resulted in a whole bunch of security reports.

I still do it for fun, but not methodically, and not regularly. It's a great way to look at code, to learn, and sometimes it pays off.

e.g. reporting a bunch of trivial predictable-filename issues in GNU Emacs, including something referring to the (ancient) Mosaic support:

https://bugs.debian.org/747100

Fuzzing is definitely useful, and I've reported issues in awk, etc, but fuzzing tends to be used when you have a specific target in mind. I'd rarely make the effort to recompile a completely random/unknown binary with instrumentation for that.


That is awesome.


The point that "Open source code is much more likely to already have been audited better." is actually true, but with the caveats that 99% of code isn't audited at all, and the 'better' claim is dubious. Security-focused devs audit OSS projects for practice, for bounties, for the glory of finding something in a popular codebase, and just to contribute their skills. It does happen.

In the closed source world, very few companies will pay for their source code to be audited, because it's expensive and time-consuming, and most only do it if they're required to.


> very few companies will pay for their source code to be audited, because it's expensive and time-consuming, and most only do it if they're required to.

And even when they do, in my experience, they usually end up buying an expensive automated report that provides little or no real insight.


> they usually end up buying an expensive automated report that provides little or no real insight.

That's totally what we are in the process of doing (with no fewer than two separate tools, for even more wasted time!)


Code intended to be closed and released unexpectedly seems like the worst of both worlds though.


Just adding my 2c from a few (horrible) years of WordPress development:

When writing integrations between third-party modules, the documentation was rarely enough, so with open-source ones, I generally went through at least half of the total (backend) source code of each to find all the hooks I needed and would semi-often find fairly standard security issues and report them.

In contrast, if a plugin was closed-source and obfuscated, I would just go bother their support, and so their code was never looked at by anyone other than the 2 core devs. When I inevitably had to reverse-engineer parts of the code anyway and discovered issues, I got far more "hey, you broke the EULA!" responses than "thanks for the report".


Developing open source software comes with some incentives to write cleaner code.

Speaking from personal experience, here are my thoughts on it:

First, with open source software, you (potentially) expose what you write to the whole world; as a consequence, you don't want to look ridiculous by publishing atrociously bad code. In contrast, in a corporate environment, the code is not seen by so many eyes, if at all, and even if it is read, the readers are roughly a known quantity.

Second, you have far more time/freedom, especially if it's a personal project, to think about design/code architecture and to rework things if required. You can also spend time on things like unit tests, fuzzing, etc. Basically, you can more easily work on all these things that are "valuable" but difficult to "quantify/measure".

Third, people working on OSS projects are generally a bit more motivated, either because it's their subject of interest, or because of the general appeal of OSS.

Fourth, with OSS you often have more resources/services available to help with code quality: CI, static analysis, dependency auditing, etc. are one click/integration away. In a corporate environment, the procurement process for such services can be off-putting, integration setup can be an uphill battle, and there can be strong restrictions regarding external services.

Just as an example, in a previous job I tried to get budget for a Jenkins server and never got it, so I ended up "stealing" an old abandoned server from another project; once I got it, I ran into issues not being able to configure the post-commit hooks to trigger a CI build. Heck, in that job, even my home desktop was a three or four times better development machine than my work laptop.

Things are changing slowly: having good code coverage is more and more a goal, if not a company policy; code review processes are more and more common; and companies are a bit less paranoid about trusting external services for CI/analysis/fuzzing. But from what I have been able to see in most of my jobs, proprietary code bases tend to be lower quality than most OSS projects. Even the code I've produced in my free time tends to be better than the code I've written at my job.

It's not absolute, however; you can still have terrible OSS code bases. It's just that you are a bit more geared towards better code quality in OSS projects.


> People just aren't auditing random code on github for fun

Yes, they do: https://www.fsf.org/blogs/community/who-actually-reads-the-c...


Maybe the person you are replying to should have qualified “popular open source repositories”.


Like OpenSSL? Rhetorical question: OpenSSL was both open source and broadly used, and it took over two years to identify Heartbleed.

Plus, many companies, Microsoft included, open up their source code to partners.

The openness of source code has little correlation to its security.


I know it's not quite that simple, but isn't OpenSSL exactly an example of how a bug in open source software was found and fixed? Of course it took a while, and the software was already extremely widely used at that point, but bugs happen and at least it's not just lying around unfixed. I can't remember bugs in closed software getting the same kind of exposure.


I'm not sure if heartbleed is a good example here, given that it was basically a new class of exploit.


Wasn't heartbleed a fairly typical buffer overflow?


The typical buffer overflow would have been caught by OpenBSD's protective malloc.

> [...] OpenSSL adds a wrapper around malloc & free so that the library will cache memory on it's own, and not free it to the protective malloc. [...] So then a bug shows up which leaks the content of memory mishandled by that layer. [...]

https://marc.info/?l=openbsd-misc&m=139698608410938&w=2
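To make the point concrete, here is a toy sketch of that kind of caching wrapper (illustrative only, not OpenSSL's actual allocator): recycled buffers never go back to the system allocator, so a hardened malloc never gets a chance to unmap, poison, or bounds-check them.

    /* Toy freelist wrapper in the spirit of what the OpenBSD post describes
     * (illustrative only, not OpenSSL's real code). Freed buffers are cached
     * and handed straight back out, bypassing the system allocator. */
    #include <stdlib.h>

    #define BUF_SIZE 4096

    struct node { struct node *next; };
    static struct node *freelist = NULL;

    void *buf_alloc(void) {
        if (freelist) {                  /* reuse a cached buffer...              */
            void *p = freelist;
            freelist = freelist->next;
            return p;                    /* ...stale contents and all             */
        }
        return malloc(BUF_SIZE);         /* only hit the real allocator on a miss */
    }

    void buf_free(void *p) {
        struct node *n = p;              /* never calls free(), so guard pages    */
        n->next = freelist;              /* and use-after-free checks in a        */
        freelist = n;                    /* hardened malloc never get to trigger  */
    }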


I don't think the vulnerability was in malloc'ed memory; it was some buffer on the stack. I've actually patched OpenSSL to stop Heartbleed as an exercise, and IIRC the fix was in fact just preventing a typical buffer overflow.
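For reference, a simplified sketch of the Heartbleed pattern as commonly described (not OpenSSL's actual code; the function and signature here are invented): the bug was trusting the attacker-supplied length field, and the fix boiled down to a single bounds check.

    /* Simplified sketch of the Heartbleed-style bug (illustrative only, not
     * OpenSSL's real code). The attacker-controlled length field is trusted,
     * so the memcpy reads far past the bytes that actually arrived and echoes
     * neighbouring memory back to the peer. */
    #include <stdint.h>
    #include <stdlib.h>
    #include <string.h>

    /* msg points at the received heartbeat record, msg_len is how many bytes
     * actually arrived; send_reply is a stand-in for the TLS record layer. */
    void handle_heartbeat(const uint8_t *msg, size_t msg_len,
                          void (*send_reply)(const uint8_t *, size_t)) {
        if (msg_len < 3)
            return;
        size_t payload = ((size_t)msg[1] << 8) | msg[2]; /* claimed payload length */

        /* The missing check the eventual fix added, roughly:
         *   if (3 + payload > msg_len) return;           // silently discard */

        uint8_t *reply = malloc(3 + payload);
        if (!reply)
            return;
        reply[0] = 2;                        /* heartbeat response type         */
        reply[1] = msg[1];
        reply[2] = msg[2];
        memcpy(reply + 3, msg + 3, payload); /* over-read when payload is a lie */
        send_reply(reply, 3 + payload);
        free(reply);
    }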


Seems like that commenter is also saying that it would’ve been caught as a regular buffer overflow bug?

> OpenSSL is not developed by a responsible team.


I've always thought of buffer overflow as writing beyond the intended bounds of the buffer.

Heartbleed is reading beyond the intended bounds remotely. I don't think there were similar attacks beforehand, but I could be wrong. I only have base-level knowledge here.


Infoleaks are nothing new.


> People just aren't auditing random code on github for fun

No, just the important code that everyone is running.


You don't have to audit that. It's so popular, someone else must have done a thorough review already!


AFAIK it had the opposite effect for OpenSSL. Not only was the code so bad that it would crash if run with a secure malloc implementation; because it was free and open source, nobody felt the need to donate [1], and only one developer was employed to work on it full time.

[1] https://arstechnica.com/information-technology/2014/04/tech-...


Well, eventually someone looked at it. And Heartbleed had probably been used for a long time before it was published.


I have to confess that I have run afl on random code on github.


>> Open source code is much more likely to already have been audited better.

> It just happens to not be true.

I think it is true: free and open source code, on average, is more likely to have been audited to a greater extent. I think most people confuse "audited to a greater extent", "coverage" and "security". The paradox here is that merely increasing the chance of fixing bugs does not automatically guarantee security by itself, nor does it guarantee sufficient code coverage.

For example, suppose the codebase has 10 serious RCE exploits: if it's a binary-only program, a pentester may be able to find 3, but if it's FOSS, the community might be able to find 5. Yet the remaining RCE exploits are still exploits. And paradoxically, a project can have fewer exploits than a binary-only alternative by a numerical measure, yet the mere discovery of a new exploit can create a huge storm and lead to the perception that the software is less secure, even if that is objectively false by the same numerical measure.

My opinion is that free and open source code, in general, often objectively reduces the number of exploits compared to a binary-only alternative. It doesn't eliminate exploits. The important question here is code coverage: a group of random hackers browsing code can never replace a systematic review (organized by the community or otherwise). Nor does it make the software inherently secure; a program that uses suboptimal programming techniques is more prone to exploits, and more reviews cannot change that fact. However, the exploits discovered and fixed by a group of random hackers are still real security improvements.

For example, OpenSSL, even before Heartbleed, was attacked by researchers around the world; some exploits involved advanced side-channel and timing attacks. The bugs they discovered and fixed are real. Now imagine a binary-only OpenSSL in a parallel universe, called CloseSSL (with other conditions, such as an underfunded development team, remaining the same). In that universe, fewer exploits are discovered and fixed, and CloseSSL may be more vulnerable to timing attacks than the OpenSSL in our universe, so our OpenSSL is objectively more "secure" by a numerical measure. But both are vulnerable to Heartbleed. In other words, being more "secure" by a numerical measure does not translate to real-world security; on the other hand, the numerical advantage of FOSS is nevertheless real. Of course, real-world programs do not behave like an ideal model; being FOSS or not also correlates with other variables such as the size of funding or audit coverage. My argument is only an ideal model treating all variables as independent.

I call it the anti-Linus's law: given more eyeballs, not all bugs are shallow, unless there are enough eyeballs. But more eyeballs are still always better than fewer.

> Most vulnerabilities are found by techniques like fuzzing, not by combing through thousands of lines of code.

Having the source code available allows pentesters and auditors to use compiler-based instrumentation for fuzzing, which is more efficient than binary fuzzing.
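As a rough illustration, here is a toy libFuzzer-style harness (the parse_packet target is hypothetical); this approach only works because you can rebuild the target from source with instrumentation:

    /* Toy coverage-guided fuzzing harness (libFuzzer style). parse_packet() is
     * a hypothetical stand-in for whatever network-facing parser you'd target.
     * Build (requires the source, which is the point):
     *   clang -g -fsanitize=fuzzer,address harness.c parser.c -o fuzz_parser */
    #include <stddef.h>
    #include <stdint.h>

    int parse_packet(const uint8_t *data, size_t size);  /* hypothetical target */

    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size) {
        parse_packet(data, size);  /* ASan + coverage feedback do the rest */
        return 0;
    }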


> Having the source code available allows pentesters and auditors to use compiler-based instrumentation for fuzzing, which is more efficient than binary fuzzing.

I will concede that is a pretty valid point. My argument is basically that there is a false sense of open source code being "more secure" because of an assumption that "the community" is checking it thoroughly. Most people will just grab it off of github and run it without giving it a second thought. Generally speaking, you don't get high-quality full code audits for free; pentesters and auditors generally like to get paid and aren't out there testing github codebases out of the goodness of their hearts.


I have to believe, given the sheer size of these communities, that the source code being available only helped to confirm what was already known. The panic seen here harkens back to the days when companies made similarly ridiculous security claims about open source software compared to proprietary software.


That seems like quite a stretch. The difference between having the source code and not having it is night and day as far as exploring potential vulnerabilities goes... which is one of the strengths of open source, as you point out, but this code was not intended to be, or written as, open source, hence the panic. Feels like you missed the mark on this one.


> Open source code is much more likely to already have been audited better.

Worth keeping in mind this isn’t a silver bullet. OpenSSL with Heartbleed comes to mind.


Very true, but OpenSSL in particular is rather infamous. Unfortunate given that so much relies on it. https://news.ycombinator.com/item?id=7556407


This is why the assumption that “open source code is more likely to be closely audited for vulnerabilities” is not true (even for incredibly core/important projects with a wide scope) and is potentially dangerous to rely on.


> This is why the assumption that “open source code is more likely to be closely audited for vulnerabilities” is not true...

That is a safe assumption, otherwise you'd have to believe that non-open source code is more closely audited - at greater expense, because businesses secretly prioritize security.


It is not 100% true in every case, but it is in practice, especially for an unexpected leak.


Shellshock still outperforms any security issue of OpenSSL in terms of time in the wild.


Open source code being more secure is a myth.


The whole "open source is audited better than closed source" is nothing but a myth and I am actually quite surprised to see this statement appear on HN.


Every statement you just made is speculation and not backed up by any meaningful data. While it’s obviously “easier” to find bugs when you can view the source code, making it one or the other doesn’t bestow any magical protections on the software.


"Time and effort required" in order to find vulnerabilities is not a magical protection. It is a legitimate protection. Not one that should be relied on, but very much something that factors in. Open sourcing software doesn't immediately improve security, but it drastically lowers the barrier of entry for researchers to start looking into it.


I’m wondering the same thing. Is there any evidence of an RCE bug out in the wild? Or was it just wild speculation because the source code is now available?

Unless they specifically hardcoded a back door into the game, I’m dubious a leak would result in an RCE so quickly, if ever.


>Unless they specifically hardcoded a back door into the game, I’m dubious a leak would result in an RCE so quickly, if ever.

AFAIK, parts of the source code had already leaked back in 2018 amongst certain circles outside Valve. It's only in the past few days that this has become common knowledge.


I'm assuming that whoever leaked the code modified it and added a remote exploit to the codebase, and that's what folks online are referring to. Happens a lot with shady non-scene types of warez.


So that would affect people who got the code, set up their build environment, built that code, and then ran it??


Yes, if this is in fact true and not a rumor, I would expect so. No one other than folks pirating and running the leaked code is affected.


Allegedly there's already an exploit in the wild that lets you open a popup in-game for all other players on a server. You can find screenshots if you look around the /r/tf2 subreddit.


"allegedly" means nothing and screenshots are so easy to fake, it's 2020. I want to see concrete proof of this alleged exploit.


I remember a custom CSS server doing this. The admin would fire off some command, and a typical in-game browser window would appear and immediately go to a site the admins ran that hosted audio files. One would start playing. You could turn it off, but they could push out the link again.


I've been hearing that an RCE for TF2 is confirmed but not for CS:GO or the other leaked code/clients.


The CSGO Twitter account says CSGO should be fine (at least on Valve's servers).


As someone who's reverse engineered large portions of a game with similar tech from the same era, I would be absolutely shocked if there were not remote exploits.


If it was open-source from the beginning, there wouldn’t have been this problem.


The leak also includes most of something called F-STOP, a cancelled Portal project, that apparently looks like Superliminal: https://www.reddit.com/r/Games/comments/eebagv/gameplay_of_a...


The F-STOP included in this leak, from my understanding, isn't Valve's prototype. It's just an in-development fan-made game trying to replicate Valve's unreleased prototype.


Strong evidence that this is fan-made: there is Cave Johnson dialogue in the video [0]. F-STOP significantly predates Portal 2 and I presume the Cave Johnson character as well.

[0] https://www.youtube.com/watch?v=HboQWe3FYbg


Cave Johnson's name was referenced in Portal 1. It could be they already had the character idea in mind.


Hm, the linked video is a bit underwhelming considering all the hype this secret mechanic had. The "polaroid effect" prototype that I saw some time ago is much closer to what I'd expect the game to be (and IMO has way more gameplay applications, though it can also be easy to break):

https://www.youtube.com/watch?v=ran_yU65Xmg

Honestly, I expected something a bit more involved than copy/pasting and scaling objects. And even then, this old Ludum Dare game (Tale of Scale) does it better IMO:

http://ludumdare.com/compo/ludum-dare-25/?action=preview&uid...

(well, it doesn't do the copy part, just the scale part but still feels better)


Here's a gameplay video which I assume is from this leak on account of the upload date: https://www.youtube.com/watch?v=HboQWe3FYbg


I think it's obvious why F-STOP wasn't finished... It looks boring. The gist seems to be that you can use the device to move and resize objects. Kind of cool, but it looks hella tedious. I always assumed it was going to be about making different-sized portals so you could change the size of things (including yourself) by passing through.

Those non-Euclidean rooms at the end of the video are sweet though. Honestly, the most surprising thing is that we didn't actually see any of these in the Portal games, given that it requires the exact same tech as the portals themselves.

Edit: The concept looks boring, not the obviously unfinished tech demo


One of the best uses of portals I've seen was in The Stanley Parable, with seamless brainfuck moments like this: https://www.youtube.com/watch?v=8_UcYPaQWrI


Antichamber is a puzzle game that uses the same thing in a ton of extremely fun ways. I highly recommend it.


I second Antichamber, it's an overlooked gem. It combines the mind-bending puzzle platforming of Portal with the intriguingly interconnected world design of Metroid.


A really interesting side effect of this is you can do it in VR to make room scale maps which are really really big. If done properly, our brains don't notice it nearly as much as you'd expect. This video does a great job of making it excruciatingly obvious. I love it.


I don't think you can be confident that this is F-STOP; multiple indications suggest that it is fan-made.

There is an obvious reason why F-STOP wasn't finished though: Valve never finishes anything. Episode 2 came out 13 years ago.


It could be fan-made, but the camera mechanic seems to be from the source code, no? There could be MORE to the mechanic that isn't implemented, but this piece at least looks lame.

edit: and for completeness, nowadays I feel compelled to point out that the latest Half-Life game came out just a month ago


Right, this is why I mention the thirteen years. That's what it took to get a new Half-Life.


Cheats and hacks were already bad enough; I imagine this won't help :( Darn, it's one of the most fun games I like to play.

Sadly, macOS Catalina killed the game for Mac users, because Apple recognized the extreme demand from casual users to break all their old 32-bit applications.


I wonder if it would be possible to use this source code to build a 64-bit version?


Security by obscurity... isn't.


agree... but it's just a video game :/


A video game that had millions of users


and that changes things how?


Just a video game? Imagine someone plays this game inside a corporate network, gets compromised, and the data for millions of people is stolen and leaked into the wild.

The function of the actual program that gets exploited is irrelevant to security risk.


https://en.wikipedia.org/wiki/Straw_man

I was referring to the relative importance in context of the parent comment. No need to conflate my words.


I literally just downloaded Steam on the mac to show my boys Team Fortress only to be met by this sad news :(

I really hope something like TF2 resurfaces in some form again. I never liked the feel of Fortnite.


You should check out Overwatch, which I consider to be TF3. They reimplemented many characters 1:1 including soldier (Pharah), medic (Mercy), demo (Junkrat), engi (Torb), as well as the 2-point capture and push-the-cart modes.

Valorant, a beta game from Riot, is a blend of Overwatch + CSGO, making it closer to TF2 6s.


Having played both quite a lot, I believe the skill ceiling of TF2 characters is much higher. Plus, rocket and sticky jumping are just too much fun to give up (sorry, Pharah and Junkrat just don't compare). I like the old-school movement abilities carried over from the Quake engine. I'm also not a huge fan of timeouts and cooldowns for everything.

Finally, I like all the different modded servers out there and the ability to play casually when I want to just mess around.


> reimplemented many characters 1:1 including soldier (Pharah), medic (Mercy), demo (Junkrat), engi (Torb)

I would disagree that these are 1:1 (they share some abilities at best) but would agree that elements of TF2 and their character abilities are indeed nearly 100% spread out across different characters in Overwatch.

E.g. Medic's uber = Ana's nano, engineer's teleporter = sym teleporter, engineer's turret = Torb's turret


Overwatch and TF2 do not really have that much in common. The skill in TF2 is far more movement based.


The person you're replying to said he was disappointed to find that MacOS Catalina can't run TF2. Why recommend a game that won't run on any version of MacOS?


> I really hope something like TF2 resurfaces in some form again. I never liked the feel of Fortnite.


Install Windows? Simple enough solution


What about Overwatch?


Playing Overwatch means playing the way Blizzard intends you to play. No new maps, no private servers, no admins to get rid of jerks, etc...

The game mechanics seem largely the same with respect to a lot of the characters, but then they added in one-directional shields and turned it into a team strategy game with much less room for individual skill, again with no real ability to just run your own server and mandate a different playstyle.


>one-directional shields and turned it into a team strategy game with much less room for individual skill

There's plenty of room for individual skill, which goes beyond hitting Q. One-directional shields only reveal that teamplay is necessary; by themselves they don't create the need for teamplay. A poorly coordinated team doesn't do a good job. Even if both teams didn't use shield tank heroes, you'd still need just as much coordination.


They've had custom games with the Workshop that can serve most of what you're asking for here; might be worth checking out.


> Overwatch is the first game from Blizzard to hit consoles the same time it was available on Windows PC. It's also the only game from the company that isn't on the Mac


If you're gonna game on a Mac, I'd suggest installing Windows via Boot Camp. IME most games simply run better that way anyway, and you don't have to deal with the issue of games that aren't macOS compatible.


Which would be a great suggestion for the person wanting to play his favorite game on the Mac, instead of recommending another non-Mac game, wouldn't it?


Parsec from an actual gaming system works very well.


Don't worry, Valve's official statement is that everything is fine and this was already leaked and patched in 2017 :)


Maybe it will actually get Valve to fix it or let mods be the new anti-cheat.


Maybe they should just embrace it and open source the code (without giving away the trademarks). The community is pretty engaged, so I bet it would be a boon for both security and modding.


I don't know what this leak contains, but many times games cannot be open sourced because they use libraries that would have to be removed before the game could be open sourced.


I've heard this a few times as well, but couldn't the game still be open-sourced, just without those libraries?

And anyone who wants to compile it needs a license for those libraries, which in many cases is free for noncommercial purposes or for students. Btw, what expensive libraries exist in that area anyway?


So, I've been involved in this sort of process before. IANAL, but the answer is likely yes, though it is complicated.

So, firstly, IP rights may not be your only encumbrance; NDAs can be even more restrictive, since leaking information about a library may be covered, for instance.

Additionally, the IP of the game code itself may be encumbered, for instance by publisher agreements, or because your code is derivative: Source, for instance, may still be considered partially derived from id code, which ZeniMax, or maybe some other party, may own, and getting those rights may be difficult (even if some large percentage may have been released in the GPL'd id codebases). And if your core engine is IP-encumbered, that may not be something you could "just release without".

So then someone is going to have to actually do the work of separating out all the third-party libraries, which may not be trivial depending on how many there are and how well separated they are.

Then at any reasonably risk-averse company, somebody is going to have to do an audit, which could be a lot of work.

And then we might not just have other people's files to remove: while it may not be a copyright violation to reference API calls of a copyrighted work (I'm honestly not sure), it sure could be an NDA violation depending on the NDA. Not to mention code that may be derived from library samples. So you either have to cut out all of that code or rework it to not be in violation.

And lastly, most companies care enough about their reputation not to want to just dump a large pile of broken code into the wild (maybe it'd be better if they would); they want to make sure it builds and runs. So once you've removed all those other bits, it may be a lot of work just to get everything building or running again. Not to mention that, aside from IP issues, a lot of the build system may depend on local infrastructure, perhaps connecting to local databases or cache servers, so you will want to make it work without that, and you also have to ensure no secrets are accidentally being divulged (maybe signing keys, or credentials to servers).

As I understand it, Valve licenses Havok, which they possibly use for any and all collision (or maybe just rigid-body stuff, who knows), and if so, even if you get the game running, you may fall through the world or perhaps not be able to move at all, which is hardly the TF2 we all want open-sourced. And that's just one possible library; maybe they use RAD Granny for animation, etc.

Or if you want to allow people to buy the licenses to compile it, you still have to do almost all of the above, then set up the infrastructure to download the correct version and set it up for builds, and that only works if they didn't make any internal modifications, or maybe they can ship a patch file that doesn't violate any copyrights, but that's also work. It also assumes that the company the libraries are licensed from still exists, is still selling them, and still offers the old versions the game is built with. And now you have to release the engine under a license that's compatible with the third-party licenses, which, given the precedent of using the GPL, adds some extra complications.

So, yeah, there is a good chance it's possible, but the amount of work varies, and it is most likely a lot.

As for libraries that exist in that area, off the top of my head, some examples:

* Havok/PhysX, physics library

* FMod/WWise, audio libraries

* Natural Motion, animation libraries

* Everything that RAD Game Tools offers, including audio/video codecs, compressors, animation libraries.

* Scaleform, UI middleware (Flash-based)

* SpeedTree, tree modelling libraries

* Enlighten, lighting (global illumination) libraries

* Platform specific libraries

* NDA encumbered IHV libraries

Keep in mind, some of these are expensive, but some are more dependent on a corporate relationship, which is not the sort of thing that frequently offers a student or noncommercial version.


Thank you for your detailed answer!

But now I'm off to bed, dreaming of a world where open source is the standard and IP and NDA madness is forgotten...


Ohh, I agree, I've been working on trying to make this happen for a project I worked on previously for about 4 years now. Have gotten some traction.


They could be using those libraries to make market items being sold.


Ah yeah, the in-game market. A whole different story.

But if it is centralized, you shouldn't be able to tamper with it much, even if you have your own local version running?


Good point. IP law is such a headache.


Reverse engineer everything that is worth reverse engineering! :)


@valvesoftware has commented on Twitter as of ~25m ago, I guess, by retweeting https://twitter.com/CSGO/status/1253075594901774336



This is another reason why I have computers for specific uses: one used exclusively for gaming and nothing else, and the same for bills, random browsing, and development. It's much easier for maintenance and sanity.


The more common and realistic attack vector is anti-cheat software and DRM. They are functionally identical to rootkits/spyware in that they install themselves as kernel drivers and monitor literally everything you do (sometimes even when the game is not running) to work out if you are cheating or attempting to crack the game. And we have no idea if they limit themselves to just that; they could be selling your browsing history on the side as well.

The DRM is super invasive as well. A lot of Wine developers have to deal with Denuvo repeatedly banning them, and one user reported being banned from Valorant after plugging in their phone to charge it.


I do the same: one for work, one for personal use, one for gaming, and all on their own isolated networks.


Once you have the source code, how easy is it to "build" the game?

Using this code could I edit the character models so that certain characters looked like Sesame Street characters and then publish that game to my personal PC for my kids to have fun with?


You could already do that, even more easily, with the modding tools available for TF2.


Maybe the community can help recompile the Source engine for 64 bit on macOS now.


Is there some kind of secret agent inside Valve? The Half-Life 2 source code got leaked before its release date as well (or parts of it).

The TF2 subreddit announcement: https://www.reddit.com/r/tf2/comments/g64t0b/data_leak_warni...


I doubt it, given that the leak is purportedly of 2017/2018 code and, according to the tweet from SteamDB, is the version that is provided to Source engine licensees.

The original HL2 code leak was by a fan from Germany who hacked into Valve's network and stole a version. https://arstechnica.com/gaming/2016/06/what-drove-one-half-l...

There are YouTube channels like VNN that rely on Valve leaks, but most of it seems like running `strings` on their update files. He does claim to have some inside sources, but they mostly seem to provide social commentary on Valve internal politics.


It was actually leaked to VNN by a Valve employee. VNN then gave it to a small group of friends, one of whom went crazy and leaked it.

Refer to this r/Games thread for more details: https://reddit.com/r/Games/comments/g61v4x/_/fo6r9ef/?contex...


VNN indicates that he never had access to the code.

https://twitter.com/ValveNewsNetwor/status/12529744828321382...

He also re-tweeted this account of the events https://twitter.com/JaycieErysdren/status/125300494000139878...


Here's a video of Tyler (VNN) explaining the leak: https://www.twitch.tv/videos/599332362

tl;dr: A Valve employee in 2016 told Tyler about a few things Valve was working on. Tyler kept the chat log and shared it with friends. Unrelatedly, TF2 and CSGO's code was semi-publicly leaked in 2018, and one of the friends found and held onto that. Tyler and the friends worked on a fan project together named F-Stop, after an unreleased Valve project, but then one of the friends had a falling out with the rest of them and published this stuff together, probably to embarrass him for sharing the chat log and maybe to make him look involved in the TF2/CSGO leak.

A little more confirmation: https://twitter.com/TeamFortress/status/1253186403900420098


That was 2004, so not a very busy agent...

I'm slightly shocked by the phrasing in that post: "It is definitely possible that someone could install a virus on your machine by just being in the same server."

That.... seems like a pretty shocking security hole, unless they are talking about unknown possibilities, in which case the term “definitely” is a bad choice. If this can be done with the source, it could have been done before, no?


That seems like a reasonable assumption for any game that accesses the network, even single-player ones.

Game code is particularly known to be "spaghetti", written "code cowboy"-style, where the result is more important than form or correctness. I mean, it's art, after all, so that seems obvious.

And do you think a lot of companies update their games after they are out? Most often, the code is final, refactors are out of the question, etc. I've never seen a bugfix for a security issue (CVE), let alone for old titles.

And that's when RCE is not by design. It is in Garry's Mod, but that's for client-side mods scripted in Lua, so theoretically sandboxed. Unreal Tournament 99, though, has plenty of servers that push DLLs for "anti-cheat" software onto your computer before you join. That one probably isn't sandboxed.

While we're talking about anti-cheat software, can we think for a moment about everything that could go wrong with a piece of software that has very deep access to the system, is sometimes in-house and not necessarily audited, and whose functionality often includes:

* downloading challenges from servers, patching them into RAM, and seeing what happens

* scanning the RAM of the whole system, plus the filesystem, for known exploits

* uploading parts of that RAM and filesystem to random servers for analysis

* taking screenshots, logging keypresses, monitoring the system, and uploading all of this.

Takeaway: sandbox your games. There's a reason I run Steam in a flatpak, on Wayland... Convenience is part of it, but that's not the main one.


> sandbox your games. There's a reason I run Steam in a flatpak, on Wayland

Even if the flatpak sandbox works perfectly, I suppose an attacker could still steal the "cookie" that automatically logs you into Steam.

Ideally you want Steam to be sandboxed, and then Steam to in turn run all the games in individual sandboxes.


I agree, and that's unfortunate, but I value it far less than I value the integrity of my computer and the data on it.

Steam itself has an interesting "Linux runtime" option for games, but it is unclear if that isolates things more than the status quo.

I don't know what I could do, short of replacing every executable in the Steam directory with something that uses a mount namespace or a similarly restrictive mechanism before launching the actual executable. Inject a modified libc to perform this on Steam's exec calls? I think the ball is in Valve's court to improve this.
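For what it's worth, here is a rough sketch of that "inject a modified libc" idea: an LD_PRELOAD shim that intercepts execve() so each child process Steam spawns could be re-launched through a sandboxing wrapper of your choice. Only a sketch under assumptions (games may also go through execv/posix_spawn, and the wrapper itself isn't shown):

    /* LD_PRELOAD shim that hooks execve(). A real version would rewrite the
     * path/argv here to go through e.g. bwrap or firejail instead of just
     * logging and passing the call through.
     * Build: gcc -shared -fPIC -o shim.so shim.c -ldl
     * Use:   LD_PRELOAD=$PWD/shim.so steam */
    #define _GNU_SOURCE
    #include <dlfcn.h>
    #include <stdio.h>

    typedef int (*execve_fn)(const char *, char *const[], char *const[]);

    int execve(const char *path, char *const argv[], char *const envp[]) {
        /* Look up the real execve so everything still runs normally. */
        execve_fn real_execve = (execve_fn)dlsym(RTLD_NEXT, "execve");

        fprintf(stderr, "[shim] exec: %s\n", path);
        return real_execve(path, argv, envp);
    }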


> Unreal Tournament 99 though, has plenty of servers that put some dlls for "anti-cheat" software on your computer before you join.

D:

People put up with that?


For reference, the anti-cheat plugin usually used by Unreal Tournament servers is AntiCheatEngine ("ACE")

https://ace.ut-files.com/index1a8f.html?p=about


Battle.net has been doing that since day one, so if you have played any game on Battle.net, you have downloaded server-provided code and executed it locally with the privileges of the user running the game.

(When a client connects to a Battle.net server, one of the early handshake steps is to download a fixed-name MPQ file, MPQ being Blizzard's proprietary archive format, which contains a DLL that gets loaded; a function with a fixed name is then run from it, which checksums your client binary and sends the result to the server to compare and allow you to progress further.)
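In code, the pattern being described looks roughly like this (the file name, export name and signature are illustrative guesses, not Blizzard's actual interface):

    /* Rough sketch of the "run a server-supplied checksum module" pattern
     * described above. The point is that whatever the downloaded module does
     * runs with the full privileges of the user playing the game. */
    #include <windows.h>
    #include <stdio.h>

    typedef unsigned int (WINAPI *check_fn)(const char *exe_path, unsigned int seed);

    int main(void) {
        /* "check.dll" stands in for the DLL extracted from the downloaded archive. */
        HMODULE mod = LoadLibraryA("check.dll");
        if (!mod) { fprintf(stderr, "load failed\n"); return 1; }

        /* Hypothetical export that hashes the game binary. */
        check_fn fn = (check_fn)GetProcAddress(mod, "CheckClient");
        if (!fn) { fprintf(stderr, "export not found\n"); FreeLibrary(mod); return 1; }

        unsigned int result = fn("game.exe", 0x1234u);
        printf("checksum sent to server: %u\n", result);

        FreeLibrary(mod);
        return 0;
    }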


I think there's a big difference between the game downloading a DLL straight from the game developers (not all that different from an update) and a game downloading a DLL from a random server you join, which could be run by anyone, which you have no reason to trust, and which you don't realize you're giving full read-write access to your computer.


Exactly. Neither is ideal, but they're not exactly equivalent...


People are defending Riot Games installing an anti-cheat driver, so that's not very surprising, if you ask me.


Not just that, the driver starts on boot and stays running even when the game isn't running.


And removing the game doesn't remove the driver. You have to remove it separately.


Better than putting up with cheaters ruining the game.


Even if you actually believe this, Riot is not known for their high-quality code. This sounds a bit snarky but is entirely serious: giving games root rights is bad enough; I absolutely don't want to run anything in kernel space from the same people who wrote the client for League of Legends.

And it's not even about trusting that Riot isn't a bad actor, Tencent conspiracy nonsense aside; it's about leaving that trash running with that level of access in a way that some malicious process could use to elevate its permissions. That is the (ab)use case that worries me.


As someone who plays CSGO, I agree with you. I wish Valve did something like this. I'm tired of matches getting ruined by cheaters, which happens very often.



Servers can distribute custom maps and assets to players, so there's a mechanism to download files to a user's computer.

As for uploads, players used to be able to set custom models in Quake 2 which were distributed to other players on the server. Though I am not sure if that was done by server admins in special cases for clan players or members, or if there was an actual upload mechanism in the game engine.


> If this can be done with the source, it could have been done before, no?

This analysis is on-point, and something a lot of sources seem to miss. A determined actor can find the exact same exploits with and without access to the source code, though I admit it is much more complicated without ("determined").


The HL2 code wasn't leaked, it was stolen: https://www.eurogamer.net/articles/2011-02-21-the-boy-who-st...



Updated to remove AMP link above.


No, it's probably more like some random Steam game downloading an entire hard drive or network share. It's not like games installed via Steam provide any kind of security or sandbox whatsoever. Every game you download and every library those games use could be downloading all your files, capturing your screen, logging your keyboard, scanning your network for vulnerable devices, etc...


Remember that, while the code is available, it isn't legally so.

My advice is to avoid getting tainted. Do not read the code.

Of course, archivists, please do archive it. Even if Valve never open sources this, it should be possible to preserve it somewhat adequately, and it should be legal to publish at some point in the future, in some country or another.


Avoid getting tainted? What does that even mean?


Avoid reading the code, being inspired by it, and using the same patterns (or worse, snippets) in your project (which could result in legal action against you).

I have often heard it in the context of Windows operating system developers, who should be careful not to accidentally introduce open-source code into the kernel if it might have a license that is not compatible with Microsoft's.


There’s no law against that unless you’re implementing a patented algorithm, which is dumb in any case.


Alternative implementations of programs often use Clean Room reverse engineering, for a reason.


What if you copy-paste code that is under a viral license?


If you read the code and work in the same field, you may inadvertently implement some feature in a very similar way. That might lead to lawsuits.


That's why CodeWeavers, which partners with Valve, states as a requirement when hiring developers for Proton/WINE:

> No exposure to Microsoft code or reverse-engineering of Microsoft software

Ref https://www.codeweavers.com/about/jobs


usercheto21351 posted a magnet link that appears to be legitimate, but which is now [dead] for some reason. Do we really want to read speculative blogspam while rejecting the original material?


Any clues as to Half-Life 3?


What are the odds that this is a black hat move by Epic?


If the remote code execution thing in CS:GO is true, I wonder if it could lead to virtual item theft. We're talking about the potential loss of items worth tens of thousands of dollars. I'm sure Valve could eventually recover them but that could be some serious anxiety and opportunity loss for item holders that are affected.


So far, what I've seen of the RCE has been a way of triggering pop-ups on the start screen; that's not to say it's not more dangerous than that, but I just thought I'd give some context.

Most RCEs aren't carte blanche to run arbitrary code on a user's computer, but rather some way of triggering a particular code path on a remote computer.


RCE by definition involves being able to run arbitrary code, for some reasonable definition of arbitrary. "Triggering a particular code path" doesn't get you anything: if you have a webpage you can trivially make your visitors' computers execute plenty of predictable code paths, like the one to render text to the screen or to send audio to the speakers.


Valve can unwind any item transaction that occurs within their marketplaces under fraudulent circumstances.


That's why I said "I'm sure Valve could eventually recover them but that could be some serious anxiety and opportunity loss for item holders that are affected."


The fact that the servers didn't immediately get shut down is pretty irresponsible. There are tens of thousands of people logged into TF2 who are at risk of having their computers pwned because the TF2 servers are still up.


Valve has no control over user-owned servers, only the listings for the server browser. Valve was informed of this leak years ago. All of this talk of an RCE is pure speculation; the leak hasn't even been out for a day.

It's good to be informed and take steps toward being safe, but we're talking about a leak where any meaningful security flaws have had multiple years to be patched.


Valve could shut down the servers they run but there are thousands of community servers they have no control over.

Valve could possibly kill the server browser service for TF2 to stop people searching for servers, but then people could just connect directly to the community server of their choice, either from their favourites or by IP.

They could push an update via Steam which bricks the game completely, but that would piss off a metric fuckton of the userbase when the game is still playable with some precautions in place (playing on password-protected servers with people you know).

A shutdown like you are suggesting would only work for a game with publisher-provided multiplayer servers and no community servers.

I know I, for one, don't want to live in a world with only publisher-provided servers; those games regularly have their servers shut down because they are no longer profitable for the publisher/devs, leaving any remaining community out in the wind.


There was (is?) also a remote code execution vulnerability on Counter-Strike 1.6 servers that Valve didn't act on for 10+ years.



