Maxis Insider: SimCity Servers Not Necessary (rockpapershotgun.com)
271 points by amelim on March 12, 2013 | 132 comments



"With the way that the game works, we offload a significant amount of the calculations to our servers so that the computations are off the local PCs and are moved into the cloud. It wouldn't be possible to make the game offline without a significant amount of engineering work by our team."

I'm going to point a finger and say that this is clearly untrue, and very easy to disprove with basic network monitoring over 5 minutes of playing the game. I would immediately fire a systems architect who designed a single-player game to perform significant calculations on our expensive [buzzword] cloud servers. Unless we use different definitions of the word significant...

It's a little ridiculous to suggest. Plus, for zero benefit (beyond DRM), it would have a non-negligible effect on their bottom line if they were burning compute cycles for SimCity on their own servers...

Anonymous source or not, it's hard not to agree with what the insider has said.


It's clear that the quoted text was a lie from the start.

There's nothing you could do in an EC2 instance that I couldn't do on my quad core i7 at many times the speed and a fraction of the cost. Even if you matched users one-for-one with large EC2 instances, you'd be looking at hundreds of thousands of dollars an hour (and that many don't exist).
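A quick back-of-envelope using rough 2013-era on-demand EC2 prices (the concurrency figure comes from elsewhere in this thread; the prices are assumptions from memory, so treat the exact numbers as illustrative):

    # Back-of-envelope: cost of matching users one-for-one with EC2 instances.
    # Prices are rough 2013 on-demand figures, assumed for illustration.
    concurrent_users = 100_000            # assumed peak concurrency
    price_per_hour = {
        "m1.large": 0.24,                 # approximate 2013 on-demand price
        "cc2.8xlarge": 2.40,              # approximate 2013 big-instance price
    }
    for instance, price in price_per_hour.items():
        print(f"{instance}: ${concurrent_users * price:,.0f}/hour")
    # m1.large: $24,000/hour; cc2.8xlarge: $240,000/hour -- i.e. "hundreds of
    # thousands of dollars an hour" once you pick beefy instances.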


Approaching the question from the opposite direction, it's also clear if you have any familiarity with the kinds of per-player CPU budgets that server-side games allocate. The norms for games like WoW aren't anywhere near enough to run even the last-gen SimCity simulation, and that's a 10-year-old game. Unless EA decided to allocate unprecedented levels of per-player server-side CPU, it's unlikely they're doing any significant percentage of the simulation on the server side.


You guys are all raising good points, but, at a higher level, did anyone really believe EA to begin with?


When Diablo 1 and 2 were launched, the single player was 100% client-side. As a result, there was rampant cheating. Jump online with your character, and you had an unfair advantage because of your infinite gold, stacked character and rare inventory.

Diablo 3 went always-online to help solve this problem. Loot discovery, inventory, and fighting outcomes are entirely controlled server-side. While they could have kept separate online-only and offline characters, it's reasonable that they decided on online-only for all characters rather than duplicating the logic and engineering. Not to mention DRM.

With Sim City, it's conceivable that they went this way as well.


Nope nope nope.

I'm sorry, but you are wrong here. While Diablo 2 had realms where you could bring your single-player character, most people chose to play on the closed Battle.net realms, where all character info was stored online.

The servers where you could bring your single-player character were just a showcase for people who used character/item editing programs to create insanely stacked gear and characters, and like I said, hardly anyone really played on them.

The reason Blizzard went online-only with Diablo 3 and SC2 was that Diablo 2's Battle.net was reverse-engineered, and there was an abundance of servers all over the world that could be played on with a fake CD key. I remember specifically in Eastern Europe we had quite a few servers, and obviously, with the average monthly salary being less than $200, no one could afford to buy a PC game. Even LAN centers had cracked versions of all of the games and hosted their own servers.

see http://en.wikipedia.org/wiki/Bnetd

Anyway, my point is that you cannot have locally stored game information that can be imported into an online realm and have a direct impact. People will edit that information to create whatever they want. But a game should only require you to be online if it is an online game, period. If a game can be played offline, there is no reason whatsoever for it to force you to use online authentication in order to play in a local environment.


> The reason Blizzard went online-only with Diablo 3 and SC2 was that Diablo 2's Battle.net was reverse-engineered, and there was an abundance of servers all over the world that could be played on with a fake CD key.

I don't think that's the only reason. Diablo II is plagued with bots and duping. While the balance between client-server data on the Battle.net closed realms is much better than it was with Diablo (where you could just edit your character data locally to increase gold/upgrade inventory online), Diablo II still has the problems of 1) loading the entire level map into memory at once, giving bots the opportunity to path their way to POIs with no effort and 2) reconciling local inventory with the server's inventory after lag spikes and server crashes, which is hypothesized to be the main method dupers use.

But, as could be expected, both botting and duping happen in D3 anyway. And, to your point, during the D3 beta period there were several devs who were able to reverse-engineer the D3 protocol and create a local server. shrug


In addition to your point, Diablo 3 also introduced a real-money auction system, which necessitated far deeper control over inventory and the like. Online-only for such a system is a fairly obvious choice.


I think there were a lot of players who were frustrated when they made a character offline and then got invited to play with a friend on Battle.net and had to start from scratch. I assume that frustration was part of the reason (among others) that they made all characters online/Battle.net characters.


Thanks for clarifying.


Single-player cheating, a.k.a. "God Mode", was considered a feature for early versions of SimCity. Multiplayer requires a network connection either way. As long as multiplayer is optional, cheating issues should not be an impediment to single player.

There are gamers who prefer to play with unlimited resources and complete control of the situation. Many prefer a sandbox where other gamers cannot mess with their experience. Are you telling me Maxis is in the business of getting between the consumer and their game? That's a losing business proposition if it's true.


I assumed they were talking about the 'world' economy, and if that's the case it may be both true and irrelevant.

Ex: ~20,000 player cities are uploaded into a model. They do a simple calculation based on excess energy, pollution, etc. The results of that are fed back down, and then they run the model again, adjusting for new cities and client city updates. Now, even if 10 MHz per city is used, you're talking about 200 GHz worth of processing, which is far more than an i7, but it's shared and mostly irrelevant in single player since you could just as easily fake the global numbers.
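A minimal sketch of what such a global pass might look like (all field names and rates are invented for illustration; nothing here reflects Maxis' actual model):

    def aggregate_region(cities):
        """Fold per-city uploads into a handful of global numbers."""
        totals = {"excess_energy": 0.0, "pollution": 0.0, "population": 0}
        for city in cities:
            totals["excess_energy"] += city["energy_produced"] - city["energy_used"]
            totals["pollution"] += city["pollution"]
            totals["population"] += city["population"]
        return totals

    def global_update(cities):
        """One cheap pass: aggregate everything, then compute per-city feedback."""
        totals = aggregate_region(cities)
        shared_energy = totals["excess_energy"] / max(len(cities), 1)
        return [{"city_id": c["id"], "energy_credit": shared_energy} for c in cities]

    # Even at 20,000 uploaded cities this is trivial work per pass -- which is
    # the point: plausibly server-side, and trivially faked by a local client.
    cities = [{"id": i, "energy_produced": 100.0, "energy_used": 90.0,
               "pollution": 5.0, "population": 10_000} for i in range(20_000)]
    updates = global_update(cities)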


While I agree that this is likely not truthful, I do wonder if part of the strategy of offloading more work to the cloud would be to ease a transition to mobile clients (iOS/Android), where local processing capacity is more of an issue, or to platform-neutral approaches, or to a "take your city anywhere" play model. (I doubt this is the case, because unless they have a motivation to keep it secret, it would make for a much better explanation of why they'd want to keep state in the cloud, given the trend toward more mobile gaming.)


Even if you want to offload to the cloud for mobile you'd still only want to do it for the clients which need it.


EA's claim struck me as really odd too; this just isn't the way that games are made [yet?] and if they did manage to get something like they claimed running, there would be much more interesting technical aspects of it that they probably would have done press releases for. In this day and age, standing up a system like that is still a major accomplishment, and the details of how they got around things like processing power and network bandwidth on consumer connections would be really interesting to the tech community.

TL;DR easily-verifiable claim by EA that stunk from the beginning proven wrong by anyone who knows how to use Wireshark

Cue even worse consumer backlash at EA.


Forget Wireshark, the article implies that some of the critics tried the good old fashioned "yank the Internet cord and see how long it takes to break" method and got 20 minutes of playability.


You can only offload processing that has barely any latency requirement. If you send off loot or hit calculations (like D3) to a cloud server and it takes minutes for them to finish, the game becomes unplayable.


Maybe not the case for SimCity, but for Diablo 3 a lot of game calculations are done server side to prevent hacks in the item drop rate and item duplication. Because Diablo 3 has a real money economy, it's crucial to ensure that the items in the game are authentic and not acquired through mods/trainers/cheat programs.

Anyway, offloading processing to the server does have its benefits and uses, just maybe not in this case with SimCity (though I'm not sure about this, having no experience with the series).


It's clear that it's a lie, because if it were true it would've been delivered in the form of an awesome tech demo and not as an excuse for a broken game.

That said however...

If there's a shared world, then running its simulation server-side makes sense. Not the game minutiae, but global state. Something like weather, simulated stock markets, etc. The environment, basically. That's not to say that SimCity has any of this, because it doesn't.


One can argue that not everyone has an i7, and offloading work to the server enables it to run on weaker computers.


One can also argue that not everyone has google fiber, and running locally enables it to run when internet access is slow or unreliable


What reasonable company would justify spending tens of thousands an hour on server costs, when they could optimise their code a little more and run it for free?


You do know what c-suite we are discussing, correct? ;)


The richest games company makes decisions which benefit them...


So let's assume that the average person wanting to play SimCity is running a Core2Duo (released 7 years ago). It's hard to find benchmarks directly comparing a Core2 E6600 to something like an Ivy Bridge Xeon that you'd expect to find in a modern dual-socket 1U server, but even looking at a TomsHardware chart of x86 core performance can tell you that an i7-2600k is only about twice as powerful as a Pentium 4 HT660 (core-for-core) http://www.tomshardware.com/charts/x86-core-performance-comp...

Bottom line, the total cost of ownership doesn't at all make business sense to do right now. In 10 years, it very well might.

See also: https://gist.github.com/jboner/2841832 ("Latency Numbers Every Programmer Should Know"). TL;DR

    Main memory reference              100 ns           (20x L2 cache, 200x L1 cache)
    Send packet CA->Netherlands->CA    150,000,000 ns   (150 ms)



The core-for-core thing makes a huge difference in the real world that doesn't show up on single-core benchmarks. The HT660 was a good chip in its day, but it's at a four-to-one disadvantage for code that's multithreaded and/or running on a busy PC.

I think Sandy Bridge is my favorite CPU of all time. It does a truly massive amount of work without consuming significantly more power than the part it replaced.


In other news, the Pope is Catholic...


Right now (checks the news) this is technically false.


Not any more. Your idioms are once again safe.


Technically, the Antipope is Catholic. Inconsistent systems prove everything.


I'm on your side, I think this is stupid, but I can't let this stand:

> I'm going to point a finger and say that this is clearly untrue, and very easy to disprove just from basic network monitoring over 5 minutes of playing the game.

C'mon. The amount of computation done can be completely uncorrelated with the number of bits sent over the network, in the same way that saying something offhand to someone could have a massive effect on the final state of the planet.


If you send a small number of bits over the network, the number of distinct responses you can get is commensurately small. So if you aren't sending a lot of bits, it would be possible to simply have a local mapping of inputs to outputs. There is a certain size a message has to have before it's worth sending instead of solving the problem locally; obviously, I don't know what that size is, or whether SimCity's packets were smaller than it. But the claim of the OP isn't totally absurd.


> If you send a small number of bits over the network, the number of distinct responses you can get is commensurately small.

Oh, come on! You can entirely specify a cosmically hard problem in just a few kB. Prime factorization, anyone? Use discrete logarithms in finite fields, and you get down to handfuls of bytes.

Your conclusion is probably right, but your theoretical basis for it leaves a lot to be desired.
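For example (a toy, nothing to do with SimCity's actual traffic): a payload of a couple hundred bytes can name a problem no client could solve locally. A sketch, assuming sympy is available:

    # Toy illustration only: a tiny request can encode an enormous computation.
    from sympy import randprime

    p = randprime(2**1023, 2**1024)
    q = randprime(2**1023, 2**1024)
    n = p * q  # a 2048-bit semiprime; factoring it is far beyond any client
    payload = n.to_bytes((n.bit_length() + 7) // 8, "big")
    print(len(payload))  # ~256 bytes: a small message, a cosmically hard problem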


If a response is too small to describe the changes to the state of your city at that time, clearly it's a thick client which mostly knows how to simulate the city.


That sounds much better -- less like some pseudo information theory or complexity nonsense.

That said, I would put some small calculations on a server if I thought cheating was an issue. This wouldn't necessarily apply to single player games, though.


You don't have to transmit many bits to perform an internet search.

When you perform an internet search are you all by yourself consuming more computing resources than the i5 processor in your computer? That's unlikely, but you'd also have a difficult time replicating the functionality of Google with your CPU alone and only the storage on your own laptop.


Google is matching your query against a humongous database. In Maxis' case, the only relevant data is some aggregate data from the other cities in your region.


True, Maxis probably isn't doing much computation. rz2k is simply pointing out that Fargren's argument is incorrect. Same as what stcredzero said, "Your conclusion is probably right, but your theoretical basis for it leaves a lot to be desired."


People have been able to play it offline for a limited time. This just makes EA's claims look worse.

http://kotaku.com/5990165/my-simcity-city-thrived-offline-fo...


You can apparently play for even longer if you change your computer's clock.


And now knowing this, we'll have a working fake DRM server in 3...2....1...

(because I'm sure that crackers took until now to figure out that it only needed the EA server for saved game storage and DRM verification)


How hard would it be to stick a proxy between the game and the servers (on the player's machine), capture some data, and configure the proxy to listen for message 'X' from the game and return response 'Y'?

If that could be done fairly easily (by someone with far greater skills than myself; I can do it for web dev but not this) and the gameplay went on for several hours, would that not put a very big hole in EA's argument?

I mean other than the one already sitting there...
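A minimal sketch of the kind of canned-response proxy being described (the upstream host, port, and message formats are placeholders, not SimCity's real protocol; as the reply below notes, certificate pinning defeats this approach for SimCity specifically):

    import socket
    import threading

    LISTEN = ("127.0.0.1", 8443)
    UPSTREAM = ("game.example.com", 443)    # placeholder upstream server
    CANNED = {b"MSG_X": b"RESPONSE_Y"}      # request prefix -> canned reply

    def handle(client):
        request = client.recv(65536)
        for prefix, reply in CANNED.items():
            if request.startswith(prefix):  # recorded message: answer locally
                client.sendall(reply)
                client.close()
                return
        upstream = socket.create_connection(UPSTREAM)   # otherwise pass through
        upstream.sendall(request)
        client.sendall(upstream.recv(65536))
        upstream.close()
        client.close()

    server = socket.socket()
    server.bind(LISTEN)
    server.listen(5)
    while True:
        conn, _ = server.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()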


I tried that days ago with Wireshark. SimCity uses SSL for server communication and it has hard coded certificates - it does not use the OS SSL certificates. This prevents you from using a self signed cert to decrypt the data, at least without complicated patching of the game exe.
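For reference, pinning of that sort usually amounts to comparing the presented certificate's fingerprint against a digest baked into the binary; a minimal sketch (the pinned digest here is a placeholder, not SimCity's actual certificate hash):

    import hashlib
    import ssl

    PINNED_SHA256 = "00" * 32  # placeholder fingerprint baked into the client

    def connection_is_pinned(host, port=443):
        pem = ssl.get_server_certificate((host, port))
        der = ssl.PEM_cert_to_DER_cert(pem)
        return hashlib.sha256(der).hexdigest() == PINNED_SHA256

    # A MITM proxy presenting a self-signed cert fails this check, which is
    # why a local CA plus Wireshark doesn't work without patching the exe.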


How about catching the packets before SSL? I have no knowledge about modern Windows debugging, or how Simcity might block a debugger. But I guess you could pinpoint the location of the messages just before SSL encryption, and just dump them out?


That sounds plausible. Again, I'm not a windows guy, but unless they've statically linked the SSL libraries, you should just be able to inject your own dll and capture the data on the way into the library.


I would think that they have statically linked it, which is why I thought about using a debugger to catch the data. With dynamically linked library, such as OpenSSL, it would be quite easy to capture the data.


You could just search for the certificates in the code, and update them with your own.


But haven't they been rushing updates out the door? They may have been careless given the situation.


Presumably, it would be as hard as breaking the DRM scheme that they're using to prevent people from doing that to play illicitly shared copies in offline-only mode.


There is precedent for this; it's how the Assassin's Creed II DRM was bypassed. (And simple versions of it were done quickly.)


There are several games and simulators that send messages to the server for processing and stream video/images back from the server. A game like that functions as a terminal of sorts; it lets the client farm the complex calculations/rendering out to the servers.

Also, many network multiplayer games rely on a server for calculations like timing, collision detection, bullet-hits, race position, and the like.


It doesn't apply to SimCity. SimCity is not a real-time game; you can even pause and accelerate time, so you don't need servers to synchronize positions or detect collisions with objects rendered on other machines. And the system requirements make it clear that rendering is done client-side.



The thing I don't understand about all this is why they were unable to bring new servers online. After everything hit the fan, they announced they were working on it and had gotten two new servers up the day before.

Two servers?

It just seems unbelievable to me that the backend was designed in a way that it couldn't be scaled out any faster than that. Since each region is a discrete unit, you'd think they should be able to move them between servers.

Was it all intertwined? Did the regions, stats, achievements, and DRM all run out of the same database? Were they not separate services?

They had to know this game would be popular, they've been pushing it for months (to great effect). It's a major property and the first release in about a decade.

Then there is EA. Even if Maxis couldn't figure this out (and I doubt that), EA has online experience. They're the publisher for Mass Effect, Madden, Fifa, NCAA, and more. They should have the resources, the people, and the experience to have prevented this.

Even if you completely ignore the DRM or the seemingly unimportant always-online requirement, this whole thing still seems botched. There were multiple groups who should have known better and prevented this. My understanding is that they got some warning signs during the beta.

I would kill for a postmortem blog or article on Gamasutra explaining why they couldn't scale out faster; to know what decision was the lynchpin that held them back.


Each game "server" for SimCity is actually an Amazon EC2 cluster of servers with one central master DB server. Even when the servers were "full" at game launch, all of the EC2 servers were responding to requests normally; it was the cluster's master DB server that was slow. All of the "servers" are actually in Amazon EC2's UK region.

This brings us to the scalability problems and why regions/cities are not shared to all servers. The database is the bottleneck, so sharing regions between servers would only worsen performance.


That makes it even more baffling why they couldn't bring up more servers.

If it's all just a chef/puppet based infrastructure in EC2, you should be maybe 20-30 minutes away from pumping out a new 'server'. One is as easy as ten, at that point.


We're talking about EA here. You need to include the latency required to go through enough bureaucracy layers to approve the expenditure of funds on another cluster.


Not to say that there isn't bureaucracy in a company their size, but the launch of a game this big is a really big deal. There's a tremendous amount of PR, and they're getting savaged. Polygon downgraded their review from 9.5 to 4.0 because of the server issues. I think if they could cut a decent-sized check to fix the issues, they would. The problem is likely in engineering, like a database that isn't scaling.


> The problem is likely in engineering, like a database that isn't scaling.

If it were that simple, just cut the number of users per cluster and throw 10 more up.

> I think if they could cut a decent-sized check to fix the issues

Doubtful within the context of a quick fix, but it is likely the root issue. See the simple solution above that takes 30 minutes to roll out. EA is not a company run by engineers; it's not a company run by people who understand anything about engineering. What sounds to us like a simple solution that can easily be implemented by throwing money at it and reaping the customer goodwill is completely foreign to a company like that. You may as well be speaking Klingon when you recommend just throwing new clusters at it.


I've met enough people who worked at EA to know they have competent engineers and managers -- they are not completely inept. When hundreds of millions of dollars are suddenly at risk, you have a clear channel straight to the CEO to get the resources that you need.

I'm giving EA the benefit of the doubt that they ruled out 30-minute fixes. I can't see how any of us can really speculate as to how long it should take to fix when we don't really know any details. For example, if it was a database bottleneck, would you commit to walking in and fixing it in 30 minutes? Or even 30 hours? I think you'd want to know the details, because the scope can easily be off by 1-2 orders of magnitude.


> I've met enough people who worked at EA to know they have competent engineers and managers -- they are not completely inept.

Nope, nope, nope. Nobody in their right mind will believe that. If they botch a game launch this badly, then they are complete morons. You simply cannot claim that they are competent after that clusterfk.

Under "competent" I understand "Having sufficient skill, knowledge, ability, or qualifications." (definition from wiktionary). Their engineers clearly have neither the skill nor ability to fix the problems in timely manner. Nor did (or do) they have enough knowledge how the servers behave under such high load, otherwise they'd fix it before the launch. And if the problems were known, then the managers failed to delay the launch. A company which knowingly releases such a game is not run by competent people.


I believe it. All it takes is one higher up executive to pick a launch date and mandate that it happens, damn the torpedoes. This says nothing about the competence of the engineers and their managers in the trenches.

I mean, you could question their willingness to work at the company. But, working on the latest sim city sounds like a fun project and people like paychecks.


Bingo! As a former EA senior engineer, my money is on someone in EA marketing setting the dates and forcing an 80-hour week crunch time for the duration of this title's production. They probably even brought in "experts" who ignored foundation code and slammed in their own logic "'cause it works" and now they have a clusterfuck of logic no one understands. Been there, been screwed by their managers, and left for good. EA can rot in hell.


Databases are often the bottleneck of an infrastructure. You can't just spin up new instances to fix the problem; many times you have to completely re-architect your database schema and architecture to handle the increased demand.


A DB bottleneck and lousy regionalization seem like exactly the kind of thing EA's experience in this area would prevent. What gives?


While I don't know for certain, I can't possibly imagine that they've just added two physical machines. Small websites operate with more servers than that, let alone a AAA title from a major studio. I don't think they literally mean two servers - it sounds like they're trying to make the explanation simpler for consumers. What they're probably referring to is a self-contained cluster of machines that runs all of the necessary services that the game relies on - databases, notifications, social rankings, etc.

That being said, all of this wouldn't be needed if they'd just release an offline mode. The upcoming DLC content to unlock expected features (bigger cities, more transportation options, etc) is bad enough, but the always on requirement just makes this game impossible for me to buy.


"While I don't know for certain, I can't possibly imagine that they've just added two physical machines."

I can totally imagine that. Be afraid. Be very afraid. :)


I've actually seen entire companies and universities run on two physical machines, for sufficiently large values of two physical machines.

When I was working there, the University of Oregon ran almost all of the routine computation and webpages on one big Sun box that was about as fast as my calculator, and one enormous Vax box that was about 8086 level speed.


Your explanation sounds more plausible. It could be more akin to Blizzard adding 2 new realms to their realm list, rather than 2 physical computers.


A friend of mine put it this way: let people run disconnected, offline, and in the UI show a big glass dome over their city cutting them off Simpsons Movie style[1]. Over time, gas, food, and water run out, necessitating a reconnection.

[1] https://www.youtube.com/watch?v=1kPUGLt1DWQ


With respect to the graphics, it looks awesome, and I was expecting it to be just as great as previous iterations. I actually wanted to play the game until I found out it was social multiplayer with cloud-based saves.

That just killed it. I have since bought a copy of SimCity 4 and plan to be quite happy with it for another decade or so.


> That just killed it.

Really? Not the terrible launch?

This says nothing about the game, only your preference of playing single player games over multiplayer games. There are plenty of great games out there with 'social multiplayer' or 'cloud-based saves'. These are not problems that drive away players.


Ostensibly they are, if they are the root of EA's issues. I don't complain about the servers being down, because I don't play games that require servers' constant attention like that. If it were anything other than 'social multiplayer', it would be a real-time strategy game, and oddly, while those games tend to sink or swim on the balance, playability, and fun of their multiplayer side, I generally prefer to play single-player there too, as I'd tend to get my ass handed to me by Koreans.

I would rather play Starcraft today than buy SC2. OK so that is stretching, but I haven't tried the new StarCraft, and it's mostly because of this DRM situation.


OK. Today I went out and tried Jurassic Park Builder which is basically Sim City Lite for Android.

I am frustrated because I can't start a new game without destroying my current progress. I took a slow route, spent all of my coins trying to evolve my Triceratops too soon, and now I can't use it to progress faster. But this is only the first day.

It's not as bad as Vector, that makes it clear I won't be able to complete the game without a purchase. They could be lying. It might be worth the bling just to own the special tricks. Meanwhile I have come up with some great ideas for how to use the bitcoin in new and exciting ways. I'm drunk!


I'm not sure how fun "a limited single-player game without all the nifty region stuff" would be, considering how small each city is. A lot of the fun is playing at the region level.


Don't worry.

I'm sure Sim City "Megatropolis" DLC is soon coming that doubles the default city size. Only $19.95 or whatever they think people will pay.

And don't worry about the traffic jams from bigger cities. The SimCity Subway edition will come out soon after. Only $15.95.

(yes I am very cynical and angry about this and making bitter sarcastic comments. This is a game I've wanted for a long time but is so flawed because of bad decisions)


I'll wait till I get the "Gold Edition" or whatever that includes all these expansion packs and the original game for the same price as the game costs now, a year down the road.


I think you might be waiting in vain. Unlike most other franchises, The Sims in particular seem to have an extraordinarily long half-life when it comes to DLC cost. I'd be rather surprised if Sim City was any different, given how well it's sold despite terrible PR work.


I'd love that too, but have you seen Sims 3 and its DLC? One of the reasons I cancelled my pre-order was because I realized that EA would undoubtedly milk SimCity customers for as much as they could. I wanted no part of that.


I suspect it would be at least slightly more fun than not being able to play at all.


Considering that about 90% of the gameplay comes from interacting with the other cities in the region, I really don't think this would work out as well as people think it would. The cities are small enough that you can't supply all the power/water/garbage/people/industry out of one town, and it's the EA servers that are maintaining the connections between them.


Given the small size, the typical request is to be able to play/control an entire region offline.


The question would be whether a single PC has enough computing power to run the entire region. It looks like the new SimCity engine actually simulates the behavior of each car/person and resource transport. Would simulating all the agents of the entire region work?


I don't know if that argument works economically. EA has to support something like 100,000 simultaneous users, so if the server side needs more than one PC worth per user then their EC2 bill would be off the charts. (I see nwh made this argument earlier.) I suspect that inactive cities in a region are simulated with a lower level of detail.


Since players can run at different speeds, I'm going to say that inactive cities are simply not run and exist in a paused state. They probably produce/consume resources at a steady rate from their neighbouring cities, based on a statistical average.


EA does not have to simulate the cities now; each city is already simulated by its owner (and view-only for everyone else). EA only has to relay information between cities.

But they would have to do it if you could actually control the other cities in a region. If you play one city and then switch to another city for, say, one game year, what's going to happen to your previous city? Does its time stop? Would an AI have to take good care of all the cities?


It doesn't seem like the game is that taxing on decently good processors. It seems like you should easily be able to do those calculations if you have some extra cores lying around.

One thing to keep in mind is that even in big, 16-city regions, only subregions of 4 cities each are actually connected by roads and rail. In fact, these subregions are practically autonomous, except maybe for air travel (I don't know). This is why you have 4 great works per 16-city region. So the first thing to do, if there are actual calculations taking place between these subregions, would be to cut that off.

Now we only have the interconnections of 4 cities to deal with. Only 1 of these cities is 'active' at a time (the one you're currently playing on). I don't see why you couldn't do a rougher simulation of what is happening between your active city and the three other cities. You don't have to track agents from other cities to the active one, just have a counter that keeps track of total population.

I may just be a naïve computer science student, but this doesn't seem terribly hard to do if you already have everything else in place. And it couldn't possibly be slower than what they currently have: I've watched one streamer have 3 cities within a subregion all have different values for progress on a great work. They weren't synced up at all. It was just silly. On one local machine, there's no way this could happen.
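A sketch of the rough approximation being suggested (all structures and rates are invented for illustration, not SimCity's actual model):

    # Hypothetical offline approximation of a 4-city subregion: only the
    # active city is fully simulated; the other three are reduced to a few
    # counters that exchange resources at a steady rate each tick.

    class CoarseCity:
        """An inactive neighbour, tracked only as aggregate counters."""
        def __init__(self, population, surplus_power):
            self.population = population
            self.surplus_power = surplus_power  # MW it can export per tick

    def tick_active_city(active, neighbours):
        """One simulation step for the active city plus coarse neighbours."""
        imported_power = sum(n.surplus_power for n in neighbours)
        active["power_available"] = active["power_generated"] + imported_power
        # Commuters from neighbours modelled as a single number, not as agents.
        active["workforce"] = active["population"] + sum(
            int(0.1 * n.population) for n in neighbours)

    active = {"population": 50_000, "power_generated": 80.0}
    neighbours = [CoarseCity(30_000, 20.0), CoarseCity(20_000, 5.0),
                  CoarseCity(10_000, 0.0)]
    tick_active_city(active, neighbours)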


Or perhaps the city sizes are kept small due to the "connecting" stuff they're doing, and stand-alone cities could be larger if they didn't need to live online.


The agent-based simulation is probably much more CPU expensive than the statistical model in older SimCitys, which might also have informed the size limit.


It really seems like this is the point where they decided that social play is core to the gameplay. I imagine the conversation went something like: "How hard would it be to implement regions locally?" "A lot of work." Which gets translated into "nothing works locally" for the press.


Patiently waiting for the re-implemented game server and attendant crack that will let us point SimCity to our own networks.


Yep. I assume that the game has fallbacks for "failed save" and "bad data from server" that you could exploit / patch / use in a work-around here. Worst-case, you'd make a real server that accepts save game data and stores it, and maybe does some "verification" of client data packages (AKA "discard client response and return a big thumbs up in all cases"). It might be more complicated than that if they're implementing a challenge / response, but I have no idea how much effort they're going through for DRM these days when it's been shown to NEVER work.
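A toy sketch of that worst case, assuming (purely hypothetically) that the client speaks HTTP and only needs a success response plus somewhere to put saves; the paths and response bodies are invented, since SimCity's real protocol isn't public:

    import http.server

    class ThumbsUpHandler(http.server.BaseHTTPRequestHandler):
        def _ok(self, body=b'{"status": "ok"}'):
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def do_GET(self):   # any query: report success
            self._ok()

        def do_POST(self):  # any upload (e.g. a save): store it, then accept
            length = int(self.headers.get("Content-Length", 0))
            with open("saves.bin", "ab") as f:
                f.write(self.rfile.read(length))
            self._ok()

    http.server.HTTPServer(("127.0.0.1", 8080), ThumbsUpHandler).serve_forever()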


I've done some reversing, and this is EA, so... let's say it's not going to be something that'll be cracked in a weekend.


I could be wrong here, but if Bradshaw's claims were true--that a significant amount of calculation is offloaded to their servers--wouldn't the game be unplayable for people who have shitty internet connections anyway? It's easy for me to imagine SimCity making that many computations to simulate the game, but for all that data to be sent back and forth between the gamer's PC and their servers? It has got to strain the player's connection somehow.


Your local machine could be doing coarse-grained calculations while their server did fine-grained calculations, periodically syncing up, the same way FPSes do (which is why you see laggy players jumping about).

But from what others have said, that doesn't seem to be the case.
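A minimal sketch of that coarse/fine split, FPS-style (the local model and the correction rule are invented for illustration):

    # The client advances a cheap approximation every tick, then blends
    # toward the server's authoritative value whenever a sync arrives.

    class PredictedValue:
        def __init__(self, value, rate):
            self.value = value   # current local estimate
            self.rate = rate     # coarse local model: linear growth per tick

        def local_tick(self):
            self.value += self.rate              # cheap client-side step

        def server_correction(self, authoritative):
            # Blend toward the fine-grained server answer instead of
            # snapping, to avoid the visible "jumping about".
            self.value += 0.5 * (authoritative - self.value)

    population = PredictedValue(value=10_000, rate=12)
    for tick in range(60):
        population.local_tick()
        if tick % 30 == 29:                      # a server sync every 30 ticks
            population.server_correction(authoritative=10_500)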


Wow, people must want to play this game really badly, seeing the amount of aggro it's generated. Makes me want to go out and buy a copy!


I'm pretty much waiting for the "offline" release, the 100% all-clear that no one is having issues, or the crack, whichever comes first.


I've been waiting for years, and I'll wait until the day there is an offline mode or a patch. It's the principle.


Yup. I bought one copy because of all the rage!


So, if it's not much engineering effort for EA to make the game single-player, it shouldn't be too much engineering effort for people who are not EA...


I'm pretty sure they reused lots of Sims 1-3 code, and they really have this simulation thingy figured out. Most probably some really good and generic engine, or else they could not pump out Sims extensions every other month. I have quite some respect for this, it certainly took years.


> I'm pretty sure they reused lots of Sims 1-3 code

You made this assertion elsewhere in this thread also. I don't see any reason to believe this is the case, do you have any evidence in favor of this theory? There's a huge amount of PR from them saying this is not the case, and from my minimal experience with game development it makes no sense.


There's the same amount of PR from them saying that you need to be online because the servers manage these "massive" simulations. No, I have no evidence; it would be cool if someone from Maxis explained the differences between the Sims engine and Glassbox. What they say here [1] also applies to the Sims, only on a bigger scale. I have yet to see a feature that does not somehow exist in the Sims.

[1]: http://www.polygon.com/gaming/2012/7/4/3135209/simcity-reboo...


They didn't. It uses a new engine, Glassbox.

There's a difference between Sims and Sim City.


What makes you think SimCity reuses code from Sims 1-3?


Given that Sims 3 runs on my old Windows machine (on low graphics settings), it's not that surprising that servers are not actually needed to run SimCity. I don't know if they really use the same simulation engine, but I'd say the scale is on the same order of magnitude.


They don't. SimCity runs on a new engine called Glassbox. They've been releasing videos and articles about it for months to get people pumped up.


That's what they say, but I don't buy the marketing speak, especially now that they have been proven liars. Would you throw away all this code and start from scratch? What they describe about Glassbox was already used in Sims - agents, resources,... except multiplayer which seems to be just an occasional sync, right?


Anyone notice the interesting old-school-text-adventure error page? It wasn't 404 but some other 4xx page (forgot which) -- it seems to be resolved and gone now though.


People are complaining as if DRM was the reason they made it like this, EA is evil, yada-yada. It isn't. It's pretty clear they had a vision for this game to be a MMO/social game from day one.

The problem stems more from their failure at scaling than not supporting an offline mode. Diablo was the same crap on launch week.

There's apparent pressure from publishers to create always-online games, but something tells me the teams don't have the experience/schedule/manpower to create scalable architectures to back them. Massively multiplayer gaming is certainly a hard problem.


> People are complaining as if DRM was the reason they made it like this, EA is evil, yada-yada. It isn't. It's pretty clear they had a vision for this game to be a MMO/social game from day one.

These two statements don't contradict each other.

People are complaining because EA changed the fundamental premise of the game from single-player to multi-player, and (people think) that change was driven by business goals and not by design goals. "DRM" is no longer a strictly accurate term, because it's now a design principle rather than a technology, but that's exactly what has people so riled up. They perceive that the principles of DRM are working "up the stack," so to speak, and are now infecting not just the game technology but the game vision as well, making it a more insidious and existential threat than it was previously.

The decision to shift a fundamentally single-player franchise in a fundamentally multi-player direction is questionable at best. It may not be so black-and-white as EA handing down a mandate, but I'm certain that business interests at least influenced the decision.


If their vision was for this game to be an MMO/social game "from day one", then they should have done a way better job sharing that vision and framing the discussion that way.

The social/multiplayer aspects of Diablo, WoW, and SC are obvious to me, which is why Blizzard's move never really concerned me that much. Add on top of that the truly well-done matchmaking system, and I hardly think about it at all. Doing things well goes a long way in convincing naysayers.

However, when I think SimCity, all I can think of is single player. And the vague things I've heard about interacting with others seem really lame. Again, maybe there are some awesome multiplayer aspects, but I have not heard of them and it sounds like most people aren't that interested in them.


> they should have done a way better job sharing that vision and framing the discussion that way.

In a market where some games' design decisions for online focus and/or online passes are driven by an effort to reduce piracy and the secondary sales market, it becomes very hard to convince people that any particular game's aren't.

Good luck selling that: "No, not in this case, our design just happened to support them by coincidence, and, oh, we're also part of EA and even though the cynics have been right about our motives in the past, we're totally not doing it this time."


> There's apparent pressure from publishers to create always-online games

Maybe for something like Diablo 3 but there's a level of solo play that defines a game like Sim City.


If it isn't about DRM and is in fact about the MMO/Social aspects, then why are EA and Maxis at pains to describe it as something else?


Duh?

The top comment here is more elegant, but EA is clearly lying. Their "apology" was a joke and amounted to "Sorry we've had so much success, it's your fault for assaulting our servers."


Although I'm not thrilled with the launch experience and DRM of SimCity, I don't think that this article adds any validity due to everything coming from an anonymous source.


He isn't anonymous to RPS. They say they know who he is and verified he worked on the project; they just aren't publishing his name. That's a lot different from an "anonymous tip."

"Our source, who we have verified worked directly on the project but obviously wishes to remain anonymous, has first-hand knowledge of how the game works."

Either RPS is being duped, or it strongly adds validity to the story. RPS has a great reputation, and if they say they verified it, I think that puts the burden of proof on EA to show that the claim is incorrect.


It's possible RPS isn't known by many on HN. Gaming journalism doesn't often make its way to this place.


I think the anonymous source's account perfectly explains why players have been able to keep playing offline for up to 20 minutes without any hiccups in the simulation. If the server is simply processing the messages between cities, it makes sense that there's a time constraint on how long a city can operate without hearing from the other cities in its region.


OK, so get out Wireshark and have a peek for yourself. Figure out how to decode the data in the responses returned from the server (I'm sure it's not much more than binary packets of metadata transmitted over SSL) and examine it until you're confident.

http://ask.wireshark.org/questions/16788/wireshark-decrypt-s... http://computer-forensics.sans.org/blog/2009/03/10/pulling-b... should help get you started.

"The great thing about facts is that they're true whether you believe them or not"

Also, Notch has "confirmed" this (if you trust him more than a sketchy anonymous source that RockPaperShotgun has stated they've verified): https://twitter.com/notch/status/311535572596432896


Thanks for the links. I wasn't clear. I didn't mean to imply that what the source was saying is not possible. Since I wasn't familiar with the credibility of RPS (which I now hear is very credible), I just felt like reading the article didn't tell me anything concrete.


RPS is generally credible.


As usual with 'anonymous sources/insiders' it's all complete conjecture. The only people that truly know are EA/Maxis.

EDIT: It seems people disagree that this is complete guesswork. Yes, the game runs for a while without a connection, but that doesn't prove that servers aren't necessary, as the headline claims. There is no hard, undeniable evidence. It's the same story that has been rehashed since the game was released.

And since the Xbox 720 'leak' fiasco I take these 'sources' with a pinch of salt, regardless of how reputable the site is.

http://www.gamefront.com/tech-sites-fall-for-fake-xbox-720-l...


This isn't really conjecture. There are firm claims here (e.g. an inside source provided this information, players have been able to play without a connection to the server) that are either true or false. It may not currently be within our power to prove all of them one way or the other, but that does not make it conjecture.


Well, it's plausible, seeing as how multiple people have reported being able to play the game without an internet connection for up to 20 minutes. Additionally, the described architecture seems to fit with several of the changes Maxis has made to improve performance (disabling cheetah speed, for example).

Regardless, Maxis has stated they will respond shortly to the article. https://twitter.com/rockpapershot/status/311618456640450561


The main thing the article proves is that the servers aren't needed for any real time calculations. What's to say the servers don't provide valuable data based on previous events in the game, data that's needed for future simulation?


A guy with an axe to grind tricked some websites that either aren't focused on games or are just desperate for page views? That's not going to change my view of Rock Paper Shotgun's editorial chops.

Maybe you're right, I'll wait for the Maxis PR statement, that's where the truth will lie!

Those with an editorial reputation in the gaming press are (perhaps even uncomfortably) close to the guys in the industry they cover. They knew better than to run with that hoax, and Rob Crossley even called the hoaxer out on it (read the emails he posted to Twitter if you want to see the hoaxer have a hissy fit).

If RPS says they got a Maxis duder, they probably got one. One has to be suspicious of all reporting, but even though the NYTimes fucks the pooch sometimes, that's not really grounds to handwave away every bit of reporting ever.


"the truth will lie"

This is probably accurate.


Well, you have to choose your level of trust. I don't think RPS would post this unless they were reasonably sure. "Complete conjecture" isn't really an accurate description, I think, although "unconfirmed and incomplete" is certainly something you could say.


To get an actual Bayesian estimate, you'd need to check the reliability of the site (RPS) with respect to past "leaks" being correct or not. A single incorrect leak does not necessarily completely change the priors.



