"With the way that the game works, we offload a significant amount of the calculations to our servers so that the computations are off the local PCs and are moved into the cloud. It wouldn't be possible to make the game offline without a significant amount of engineering work by our team."
I'm going to point a finger and say that this is clearly untrue, and very easy to disprove just from basic network monitoring over 5 minutes of playing the game. I would immediately fire a systems architect that designed a single player game to compute significant calculations on our expensive [buzz word] cloud servers. Unless we use different definitions of the word significant...
It's a little ridiculous to suggest that... plus, for zero benefit (above DRM), it would have a non-negligible effect on their bottom line if they were computing time cycles for SimCity on their own servers...
Anonymous source or not, it's hard not to agree with what the insider has said.
It's clear that the quoted text was a lie from the start.
There's nothing you could do in an EC2 instance that I couldn't do on my quad core i7 at many times the speed and a fraction of the cost. Even if you matched users one-for-one with large EC2 instances, you'd be looking at hundreds of thousands of dollars an hour (and that many don't exist).
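Back-of-envelope, with made-up but plausible numbers (both the instance price and the player count here are pure assumptions):

    # hypothetical: one large EC2 instance per concurrent player
    players  = 500_000   # concurrent players at launch (assumption)
    price_hr = 0.50      # $/hour for a large instance, ballpark (assumption)
    print(f"${players * price_hr:,.0f} per hour")  # -> $250,000 per hour

Even if you shared instances ten-to-one among players, that's still an absurd ongoing bill for what amounts to a single-player simulation.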
Approaching the question from the opposite direction, it's also clear if you have any familiarity with the kinds of per-player CPU budgets that server-side games allocate. The norms for games like WoW aren't anywhere near enough to run anything close to the last-gen SimCity simulation, and that's a 10-year-old game. Unless EA decided to allocate unprecedented levels of per-player server-side CPU, it's unlikely they're doing any significant percentage of the simulation on the server side.
When Diablo 1 and 2 were launched, the single player was 100% client-side. As a result, there was rampant cheating. Jump online with your character, and you had an unfair advantage because of your infinite gold, stacked character and rare inventory.
Diablo 3 went always-online to help solve this problem. Loot discovery, inventory, and fight outcomes are entirely controlled server-side. While they could have offered separate online-only and offline characters, it's reasonable that they decided on online-only for all characters rather than duplicating the logic and engineering. Not to mention DRM.
With Sim City, it's conceivable that they went this way as well.
I'm sorry, but you are wrong here. While Diablo 2 had open realms where you could bring your single-player character, most people chose to play on the closed battle.net realms, where all character info was stored online.
The realms where you could bring your single-player character were just a showcase for people who used character/item editing programs to create insanely stacked gear and characters, and like I said, hardly anyone really played on them.
The reason Blizzard went online-only with Diablo 3 and SC2 was that Diablo 2's battle.net was reverse engineered, and there was an abundance of servers all over the world that could be played on with a fake CD key. I remember specifically that in Eastern Europe we had quite a few servers, and obviously, with the average monthly salary being less than $200, no one could afford to buy a PC game. Even LAN centers had cracked versions of all the games and hosted their own servers.
Anyway, my point is that you cannot have locally stored game information that can be imported into an online realm and have a direct impact; people will edit that information to create whatever they want. But if a game requires you to be online, it must be an online game, period. If a game can be played offline, there is no reason whatsoever for it to force you to use online authentication in order to play in a local environment.
> The reason Blizzard went online-only with Diablo 3 and SC2 was that Diablo 2's battle.net was reverse engineered, and there was an abundance of servers all over the world that could be played on with a fake CD key.
I don't think that's the only reason. Diablo II is plagued with bots and duping. While the balance between client-server data on the Battle.net closed realms is much better than it was with Diablo (where you could just edit your character data locally to increase gold/upgrade inventory online), Diablo II still has the problems of 1) loading the entire level map into memory at once, giving bots the opportunity to path their way to POIs with no effort and 2) reconciling local inventory with the server's inventory after lag spikes and server crashes, which is hypothesized to be the main method dupers use.
But, as could be expected, both botting and duping happen on D3 anyway. And, to your point, during the D3 beta period, there were several devs that were able to reverse-engineer the D3 protocol anyway and create a local server. shrug
In addition to your point, Diablo 3 also introduced a real-money auction system, which necessitated far deeper control over inventory etc. Online-only for such a system is a fairly obvious choice.
I think there were a lot of players who were frustrated when they made a character offline and then got invited to play with a friend on battle.net and had to start from scratch. I assume that frustration was part of the reason (among others) they made all characters online/battle.net characters.
Single player cheating a.k.a "God Mode" was considered a feature for early versions of SimCity. Multiplayer requires a network connection either way. As long as multiplayer is optional, cheating issues should not be an impediment to single player.
There are gamers who prefer to play with unlimited resources and complete control of the situation. Many prefer a sandbox where other gamers cannot mess with their experience. Are you telling me Maxis is in the business of getting between the consumer and their game? That's a losing business proposition if it's true.
I assumed they were talking about the 'world' economy, and if that's the case, it may be both true and irrelevant.
Ex: ~20,000 player cities are uploaded into a model. They do a simple calculation based on excess energy, pollution, etc. The results of that are fed back down, and then they run the model again, adjusting for new cities and client city updates. Now, even if 10 MHz per city is used, you're talking about 200 GHz worth of processing, which is far more than an i7, but it's shared and mostly irrelevant in single player, as you could just as easily fake the global numbers.
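Roughly this shape, as a toy (the field names and the update rule here are invented for illustration):

    # toy region model: aggregate each city's numbers, feed globals back down
    cities = [
        {"name": "A", "excess_energy": 120.0, "pollution": 3.5},
        {"name": "B", "excess_energy": -40.0, "pollution": 1.2},
        # ...in the real game, thousands of uploaded cities per region
    ]

    def run_model(cities):
        # one cheap pass over all cities produces the global figures
        surplus = sum(c["excess_energy"] for c in cities)
        smog = sum(c["pollution"] for c in cities) / len(cities)
        return {"regional_surplus": surplus, "avg_pollution": smog}

    region = run_model(cities)
    for c in cities:
        c["region"] = region  # "fed back down": clients only see aggregates

Which is exactly why it's irrelevant to offline play: a client could substitute any plausible numbers for those aggregates and the local simulation wouldn't know the difference.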
While I agree that this is likely not truthful, I do wonder: could part of the strategy of offloading more work to the cloud be to ease the transition to mobile clients (iOS/Android), where local processing capacity is more of an issue, or to platform-neutral approaches, or to a "take your city anywhere" play model? (I doubt this is the case, because unless they have a motivation to keep it secret, it would make for a much better explanation of why they'd want to keep state in the cloud, given the trend toward more mobile gaming.)
EA's claim struck me as really odd too; this just isn't the way that games are made [yet?], and if they did manage to get something like what they claimed running, there would be much more interesting technical aspects of it that they probably would have done press releases about. In this day and age, standing up a system like that is still a major accomplishment, and the details of how they got around things like processing power and network bandwidth on consumer connections would be really interesting to the tech community.
TL;DR easily-verifiable claim by EA that stunk from the beginning proven wrong by anyone who knows how to use Wireshark
Forget Wireshark, the article implies that some of the critics tried the good old fashioned "yank the Internet cord and see how long it takes to break" method and got 20 minutes of playability.
You can only offload processing that has barely any latency requirement. If you send off loot or hit calculations (like D3) to a cloudy server and it takes minutes to finish, the game becomes unplayable.
Maybe not the case for SimCity, but for Diablo 3 a lot of game calculations are done server side to prevent hacks in the item drop rate and item duplication. Because Diablo 3 has a real money economy, it's crucial to ensure that the items in the game are authentic and not acquired through mods/trainers/cheat programs.
Anyway, offloading processing to the server does have its benefits and uses, just maybe not in this case with SimCity (though I'm not sure about this, having no experience with the series).
It's clear that it's a lie, because if it were true it would've been delivered in the form of an awesome tech demo and not as an excuse for a broken game.
That said however...
If there's a shared world, then running its simulation server-side makes sense. Not the game minutiae, but global state. Something like weather, simulated stock markets, etc. The environment, basically. That's not to say that SimCity has any of this, because it doesn't.
What reasonable company would justify spending tens of thousands an hour on server costs, when they could optimise their code a little more and run it for free?
So let's assume that the average person wanting to play SimCity is running a Core 2 Duo (released 7 years ago). It's hard to find benchmarks directly comparing a Core 2 E6600 to something like an Ivy Bridge Xeon that you'd expect to find in a modern dual-socket 1U server, but even a TomsHardware chart of x86 core performance shows that an i7-2600k is only about twice as powerful as a Pentium 4 HT660 (core-for-core): http://www.tomshardware.com/charts/x86-core-performance-comp...
Bottom line, the total cost of ownership doesn't at all make business sense to do right now. In 10 years, it very well might.
The core-for-core thing makes a huge difference in the real world that doesn't show up on single-core benchmarks. The HT660 was a good chip in its day, but it's at a four-to-one disadvantage for code that's multithreaded and/or running on a busy PC.
I think Sandy Bridge is my favorite CPU of all time. It does a truly massive amount of work without consuming significantly more power than the part it replaced.
I'm on your side, I think this is stupid, but I can't let this stand:
> I'm going to point a finger and say that this is clearly untrue, and very easy to disprove just from basic network monitoring over 5 minutes of playing the game.
c'mon. The amount of computation done can be completely uncorrelated with the number of bits sent over the network, in the same way that saying something offhand to someone could have a massive effect on the final state of the planet.
If you send a small number of bits over the network, the number of responses you can get back is commensurately small. So if you aren't sending a lot of bits, it would be possible to simply have a local mapping of inputs to outputs. There is a certain size a message has to reach before it's worth sending instead of solving the problem locally; obviously, I don't know what that size is, or whether SimCity's packets were smaller than it. But the claim of the OP isn't totally absurd.
> If you send a small number of bits over the network, the number of responses you can get back is commensurately small.
Oh, come on! You can entirely specify a cosmically hard problem in just a few kB. Prime factorization, anyone? Use discrete logarithms in finite fields, and you get down to handfuls of bytes.
Your conclusion is probably right, but your theoretical basis for it leaves a lot to be desired.
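To make that concrete, a toy sketch (naive trial division, arbitrary primes; a real scheme would use numbers far too large to factor at all):

    from math import isqrt

    def factor(n):
        # naive trial division: the work grows like sqrt(n), while the
        # "request" (the digits of n) grows only logarithmically
        for d in range(2, isqrt(n) + 1):
            if n % d == 0:
                return d, n // d
        return n, 1

    p, q = 999983, 1000003              # two known primes, for the demo
    n = p * q
    print(len(str(n)), "byte request")  # a 12-byte message...
    print(factor(n))                    # ...that takes ~a million divisions

Scale the primes up and the message stays tiny while the work becomes astronomically large; payload size says nothing about computation.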
If a response is too small to describe the changes to the state of your city at that time, clearly it's a thick client which mostly knows how to simulate the city.
That sounds much better -- less like some pseudo information theory or complexity nonsense.
That said, I would put some small calculations on a server if I thought cheating was an issue. This wouldn't necessarily apply to single player games, though.
You don't have to transmit many bits to perform an internet search.
When you perform an internet search, are you, all by yourself, consuming more computing resources than the i5 processor in your computer could provide? That's unlikely, but you'd also have a difficult time replicating the functionality of Google with your CPU alone and only the storage on your own laptop.
Google is matching your query against a humongous database. In Maxis' case, the only relevant data is some aggregate data from the other cities in your region.
True, Maxis probably isn't doing much computation. rz2k is simply pointing out that Fargren's argument is incorrect. Same as what stcredzero said, "Your conclusion is probably right, but your theoretical basis for it leaves a lot to be desired."
How hard would it be to stick a proxy between the game and the servers (on the player's machine), capture some data, and configure the proxy to listen for message 'X' from the game and return response 'Y'?
If that could be done fairly easily (by someone with far greater skills than myself; I can do it for web dev but not this) and the gameplay went on for several hours, would that not put a very big hole in EA's argument?
I mean other than the one already sitting there...
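For what it's worth, the record-and-replay part is only a few lines; a rough sketch (the upstream hostname is invented, and as the reply below notes, SimCity's traffic is SSL, so this alone wouldn't get you plaintext):

    import socket, threading

    CANNED = {}  # request bytes -> recorded response bytes

    def handle(client, host, port):
        # naive: assumes one recv() captures a whole message; real traffic
        # would need proper framing
        request = client.recv(65536)
        if request in CANNED:
            client.sendall(CANNED[request])  # replay recorded response 'Y'
        else:
            with socket.create_connection((host, port)) as upstream:
                upstream.sendall(request)
                response = upstream.recv(65536)
                CANNED[request] = response   # record for later replay
                client.sendall(response)
        client.close()

    def serve(listen_port=8080, upstream=("sim-server.example.com", 443)):
        srv = socket.socket()
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", listen_port))
        srv.listen(5)
        while True:
            client, _ = srv.accept()
            threading.Thread(target=handle, args=(client, *upstream)).start()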
I tried that days ago with Wireshark. SimCity uses SSL for server communication, and it has hard-coded certificates - it does not use the OS certificate store. This prevents you from using a self-signed cert to decrypt the data, at least without complicated patching of the game exe.
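What's described here is certificate pinning; a minimal sketch of what the client-side check amounts to (the fingerprint value is made up):

    import hashlib, ssl

    PINNED_SHA256 = "ab12cd34..."  # hash baked into the game binary (made up)

    def cert_matches_pin(host, port=443):
        # fetch the server's certificate and compare its fingerprint to the
        # pin; a MITM proxy's self-signed cert hashes differently and fails
        pem = ssl.get_server_certificate((host, port))
        der = ssl.PEM_cert_to_DER_cert(pem)
        return hashlib.sha256(der).hexdigest() == PINNED_SHA256

Since the trusted hash lives inside the binary rather than the OS store, adding your own CA to Windows does nothing; hence the need to patch the exe.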
How about catching the packets before SSL? I have no knowledge of modern Windows debugging, or of how SimCity might block a debugger, but I'd guess you could pinpoint the location of the messages just before SSL encryption and dump them out.
That sounds plausible. Again, I'm not a windows guy, but unless they've statically linked the SSL libraries, you should just be able to inject your own dll and capture the data on the way into the library.
I would think that they have statically linked it, which is why I thought about using a debugger to catch the data. With a dynamically linked library, such as OpenSSL, it would be quite easy to capture the data.
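If it did turn out to be dynamically linked, the capture is almost trivial with an instrumentation tool like Frida; a rough sketch (assumes an OpenSSL-style SSL_write export, which may well not exist in SimCity's exe):

    import frida

    # attach to the game and dump buffers on their way into SSL_write,
    # i.e. before encryption
    session = frida.attach("SimCity.exe")
    script = session.create_script("""
    Interceptor.attach(Module.findExportByName(null, "SSL_write"), {
        onEnter(args) {
            // SSL_write(ssl, buf, num): dump the plaintext buffer
            send(hexdump(args[1], { length: args[2].toInt32() }));
        }
    });
    """)
    script.on("message", lambda message, data: print(message.get("payload", message)))
    script.load()
    input("hooked; press enter to quit\n")

With static linking you'd be back to the debugger approach: find the equivalent function by signature scanning and breakpoint it.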
Presumably, it would be as hard as breaking the DRM scheme that they're using to prevent people from doing that to play illicitly shared copies in offline-only mode.
There are several games and simulators which send messages to the server for processing and stream video/images back from the server. A game like this functions as a terminal of sorts; it lets the client farm out the complex calculations/rendering to the servers.
Also, many network multiplayer games rely on a server for calculations like timing, collision detection, bullet-hits, race position, and the like.
It doesn't apply to SimCity. SimCity is not a real-time game (you can even pause and accelerate time), so you don't need servers to synchronize positions or detect collisions with objects rendered on other machines. And the requirements make it clear that rendering is done client-side.