Hacker News
I cut GTA Online loading times by 70% (2021) (nee.lv)
215 points by colinprince on Jan 9, 2024 | hide | past | favorite | 99 comments


Always a fascinating read when it pops up on HN :)

One thing to note - an assumption one might make is: "Rockstar shipped obviously ridiculously suboptimal code; how did this ever get past testing??"

However, based on the author's discussion of the code, and the issue being inefficient parsing of online store content, it is likely that this not only passed testing, but was sufficiently efficient in production too... at go-live, when the store content database was small.

This likely grew slowly, and was especially bad for a specific subset of users (CPUs with poor single core performance). They may even have had trouble replicating it in non-production environments depending on their refresh policy and whatnot.

This is not to excuse it, of course; but I've seen a repeated pattern where a not-perfectly-optimal algorithm works well enough in development, testing, and initial production... then blows up in production with growth a couple of years later.

(In particular, I deal with a lot of complex SQL in my day job, and those queries have a nasty habit of going exponential with data growth. If you're implementing a greenfield application with no objective knowledge of its future size and growth patterns, it may be littered with landmines despite reasonably good efforts to avoid them. This is especially dangerous with "Big Bang" go-lives - though a game may have internal Agile sprints, I'd still consider it overall a big-bang go-live in that the majority of its content goes live at the same time, and then you get to see how your 100s/1000s/10,000s of SQL queries/code/libraries/etc behave:)


The loading times were atrocious even at launch though for Xbox 360. It wasn't a case of content piling up, they added a lot but probably a lot of their item catalogue was there from launch.

There was also server instability near launch, which the slow loading may have been mistakenly linked to - as in, we thought we were waiting 10 minutes for a server. But any actual server issue was probably more about demand/load scaling.


yeah it was definitely a case of not wanting to spend a comparatively small amount of money on dev time to even look into it

it was obviously not network based as connection speed didn't seem to change the time required


https://www.rockstargames.com/careers/openings/position/5761...

"Scrum", "Agile", "Lean".

The devs may have been too busy "sprinting" and getting their assigned tickets done to think about higher-level concerns like "is this good? Can we make things faster or better?".

I think your points also stand though.


It's insane to achieve that without access to the source code.

It's also completely baffling that a company would invest such resources building such a complex game and be fine with 6 minute load times, caused by such ... dumbness. Like selling a sportscar with a 5 litre petrol tank.


My sneaking suspicion is that the real reason that everything is so slow is that noticeable loading or processing times give the software a feeling of heft. It strokes our ego as programmers because if something takes a while to complete then it must be very complicated and important.


Interesting. I have never thought about it like this. To me it always feels like bliss if something finishes instantly.


the people maintaining GTA online systems are probably not the same ones who developed the main game (or even the initial GTA online systems)

They are probably just adding more content, but the original dev's algorithm for loading that content didn't expect a bazillion new cosmetics


Yep. The entire game is like this, massive amounts of content clumsily tacked onto something that they clearly didn’t realize would become so popular. This also shows in the interface, where basically every piece of gameplay is triggered by plumbing through the massively overcrowded Quick Menu.


I can't understand why there weren't mass firings across management and a post mortem, as well as a hiring of the guy who fixed it, a 6-figure reward, and a new team set up to troubleshoot final-build performance problems.

But I can understand that it's normal for large companies to be so incompetent, and at an older age I appreciate that it gives the little guy or company a chance to rise up - for now, until AI starts to catch these issues and large inefficient corporations can finally solidify their place in the hierarchy.


Because, and I'll say this in the spirit of your recommendations, that's fucking stupid.

The problem was sscanf calling strlen. You're going to fire your management and development team because one dev used sscanf where it's probably appropriate?

Then hire a guy who may or may not have actual game development experience and give him $100k because he took who knows how many hours to trace through and find this thing. Even for the guy himself, the fix would probably take a day to implement.

Finding it could have taken longer.

You want to call Rockstar incompetent, but you're simply going to ignore everything else that goes right in not only GTA Online, but across all of their games.


It’s not stupid to hire a guy with profiling experience when you have no profiling team that could address an obvious, outstanding, almost game-definitive issue which players hate and often quit after a disconnect. Maybe not this guy, but at least some.


And what is that guy going to do the rest of the time?

I'm sorry, no company is going to pay someone only to arbitrarily profile their software looking for potential bottlenecks to remove.

The only reason this story is notable is that there was a conclusion. He could have easily found nothing to fix.

It's just a stupid, knee-jerk reaction to call for the heads of everyone remotely responsible and replace them with the guy who happened to find the issue. And of course the writeup looks like a straight shot from noticing the problem to finding the problem - but as the author says, he's not going to put in every single dead end, unless it happens to be interesting or relevant.

While the past looks like it obviously leads to the present, it only does that because we've already walked the path.


I agree that bloody solutions are excessive, but at the same time this line of reasoning is why we still have half-hour Windows updates that overwrite maybe 500 MB in total, which would cost a few seconds on modern hardware. I bet that this "stupid knee-jerk reaction" brought e.g. Macs and iPhones to where they are, instead of a lifeless dump full of mediocre alternatives.

> And what is that guy going to do the rest of the time?

That’s the classic pre-devops dilemma. Let’s fire all these slackers - and get screwed next week. How about: nothing to do means they’re doing an excellent job?

But in general, given this idiotic reality, you’re probably right. I just don’t adore it as much.


Having nothing to do could also mean that the original developers are doing an excellent job.

It's probably those sort of reactions that got Jobs booted from Apple the first time.

This is kind of like hiring someone who won the lottery to be your financial advisor.


I disagree.

Just for the record your swearing makes you uncivilised. I was just offering my opinion. I’m allowed to do that. I was giving my professional opinion. All your replies not only make you look like you can’t take criticism but they are also IMO mostly poor and short sighted. However I appreciate your time and input and will try and reflect on them. Have a good day man and for what it’s worth Rockstar has always been one of my favourite companies over the past 2 decades.

My opinion is if I was management in that company I would accept my fate if such a transgression occurred on my watch. I wouldn’t be butthurt about people thinking I should be fired.


You are obviously not seeing the problem. The problem was not that some dev used sscanf, which led to slow code. The problem here is that this went undiscovered for YEARS and it literally cost minutes for every single GTA Online player. Management did not prioritize a problem that was literally burning lifetimes away. That is the problem.


Time doesn't work like that.

Ten people wasting 6 minutes doesn't necessarily waste an hour. No one is curing cancer with the time "saved" from loading GTA Online. More likely than not, they're playing GTA Online 6 minutes less.

And to be fair to Rockstar, which the author was, this problem lurking in the standard library isn't really to be expected. You expect your standard libraries to be decently optimized.

And why it was undiscovered for years has a very simple answer: It wasn't a priority.

Why should that problem be prioritized over everything else? Everything else that could "burn lifetimes away"? Everything that could make the game crash? Or cause people to stop playing? It's quite possible that this issue was always slightly less important than whatever else may have come up.


> Ten people wasting 6 minutes doesn't necessarily waste an hour. No one is curing cancer with the time "saved" from loading GTA Online. More likely than not, they're playing GTA Online 6 minutes less.

That's an assertion. One I don't accept without some strong backing. The extra 5 minutes might mean a discussion somewhere is a bit longer leading to valuable insight. It might mean just trying that last thing before finishing up which you wouldn't try because it shouldn't work anyway. You can imagine any world changing situation which only occurs because of those extra few minutes.

> Why should that problem be prioritized over everything else?

Because it's burning away lifetimes. Besides, no one would have complained if this had been in the game for a few months and then fixed. It was in the game for YEARS. That is not "prioritized over everything else"; that is "the priority is so low you have to dive to the Mariana Trench to find the ticket".


>One I don't accept without some strong backing.

I need to see some data on this.


I never thought someone as seemingly obtuse as you could exist but you explained why rockstar failed to capitalise further on their market domination


I remember reading this article when it first came out. It still gives me the same feeling of inadequacy. There are some people walking around us who just belong to a different league.


Discussion from a previous submission: https://news.ycombinator.com/item?id=26296339


The link to HN was added to the blog post, itself: https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times...


Great article!

I am baffled that nobody at R* cared to investigate a 6-minute load time that was there for years. Doesn't sound very AAA-gamedev to me.


I can exactly imagine how it would happen in a large company:

1. It's a boiling-frog problem which developed over several years. It started fast when the JSON file was small, but then got a little bit slower each time new content was added.

2. It's also not exactly an "interesting" part of the code base. IIRC it's a JSON file which configures the ingame shop(?). This was probably delegated to a summer intern to implement in a couple of days, maybe in an entirely different department from the people who develop the actual core game.

3. New shop items are probably added by a content team without programmers in the loop who would know by instinct that something must be wrong. People on that content team also most likely don't stay on that team for very long, which together with the boiling frog problem above means that everybody thinks "it's always been like that".

4. Any specific feedback from the outside probably gets lost in the noise of other requests.

5. ...plus a general 'not my problem' attitude common in big organizations and an always-full kanban board ;)


This isn't some obscure b2b product, this is GTA5. Surely somebody at rockstar had, yknow, played the game? And noticed they were losing hours of their life on a loading screen? Hell, it's so long you could file a complete bug report in that time.

My theory is the fix existed internally but was never merged for weird political reasons. Some have guessed it was due to advertising being done on the loading screen. Or it could have been pure dysfunction, like how fixes for all the notepad.exe problems apparently existed at Microsoft for decades, but it took a full rewrite to have working undo.


> This isn't some obscure b2b product, this is GTA5.

The 'large-organization-dynamics' are all the same though across all industries. I bet enough people down in the trenches noticed, but nobody pushed enough to get the problem investigated and fixed. There's probably several dozen bug duplicates about that problem buried somewhere in GTA5's JIRA (and indeed probably also with exact instructions how to fix, and maybe even somebody did that work but then didn't push enough to get it integrated).


This is worth chewing on. It holds so much truth.

I work in a larger business but am dealing with a similar problem. Trivial implementation/fix; nearing two years into the effort.

What am I doing? Changing 'proxy.remoteURL' for our Docker registries from a painfully slow upstream to a faster one.

You can have people in pain with a solution in hand and still find it difficult to sell.


> What am I doing? Changing 'proxy.remoteURL' for our Docker registries from a painfully slow upstream to a faster one.

> You can have people in pain with a solution in hand and still find it difficult to sell.

"Well, we don't know what side effects that might have"


> Surely somebody at rockstar had, yknow, played the game? And noticed they were losing hours of their life on a loading screen?

Actually, my understanding is that most devs don't really play the games they build, at least not in the way that people do. (Anecdotal evidence is that quality-of-life features tend not to come from developers very often). And even when they do play the games, internally, it's likely on an internal branch, with internal mocking of things like servers and the like, so a loading time bug that scales quadratically with content is unlikely to be something that would surface in internal playtesting. I also doubt they're playing their games on their own time; if you're already working 40 hours or so each week on the game, it's probably not a feeling of unwinding to put in another 5 or 6 hours on your own time after you come home from work.


> This isn't some obscure b2b product, this is GTA5. Surely somebody at rockstar had, yknow, played the game? And noticed they were losing hours of their life on a loading screen? Hell, it's so long you could file a complete bug report in that time.

My guess is that people noticed, but nobody was in the right position to debug it - their internal tooling may have meant that this code path was usually skipped or behaved differently on internal builds. And debugging a full production build was something many devs didn't do because it was painful.


Plus: most developers probably skip the full store bundle entirely, and thus don't notice it at all.


I also cannot explain it. If I was a dev on a system that had a 6 minute loading time, I'd be stopping what I was supposed to be doing and debugging it after a couple of days of suffering through it.


I started working on a codebase where the git repo is 56 GB. Turns out it's a bunch of binary files that were committed and updated a bunch of times, more than 2 years ago. They're not even in the main branch anymore. Nobody on the main team working on it seems to have cared enough to fix it before.

Don't underestimate the power of complacency


It's kinda careless to rewrite history just because a repo is large. Any reason you aren't doing a partial/shallow clone?


Why though? I understand the point of keeping some history, but what's the point of keeping individual commits from years ago? It's not as if you'd cherry pick anything from back then. And then, you can still keep a full version of the history in another repo if you really have auditing requirements that mandate keeping history forever.


> what's the point of keeping individual commits from years ago? It's not as if you'd cherry pick anything from back then.

If you extend the logic, what's the point of keeping any commits from years ago? What's a good cut off date to just start discarding history? Rewriting it?

But in the same breath, what's the point of cloning commits from years ago? It's not as if you'd cherry-pick anything from back then. And that's my point: the former requires an org decision and active action, while the other is a blobless clone. The nice thing about the blobless clone is you can always change your mind - you can fetch old blobs if you find you need them. But if you threw out old commits and need them later... that's the breaks.


That's the fun part: their dev systems almost certainly are not console dev kits, or consumer PCs.

They're absolutely jacked workstations that probably loaded it in a blazing 2.7 minutes!


Or even simpler, they have flags to deactivate loading that catalog, so the issue doesn't exist for them


But if that's true, then there's even less excuse for not fixing the bug because it would imply at least one person knew the slow loads were caused by loading a small 10MB json file.


Maybe, but not necessarily. The devs may even have been begging to fix it, but project management kept prioritizing other items in the backlog.

If there's a million fires and half of those crash the game, fixing something like a load time might easily stay under the radar.

And that's even before we consider internal political issues. The code could have been owned by a group that was refusing any PRs that would have fixed the issue. Or maybe the code was written by the big boss' nephew and fixing it would have removed his contribution.


Yeah there's definitely a scenario where the guideline was "let's hit the deadline, we'll fix it after"


I imagine the focus is on creating new things instead of fixing old things, since that will have more obvious and marketable impacts. The whales might tolerate longer load times over stale content.


If no one carries the responsibility, no one will care. Maybe the team who worked on that moved on and what’s done is done. Not sure if R* has a team that does optimization only?


Why? Clowns STILL buy GTA by the truckload.


Why clowns?


Because they keep supporting garbage like 6 minute load times with their money.

6 Minutes?!?!

Are we using Commodore 1541 FDDs again?

Per the article, most of that wait is garbage coding and that's apparently just fine, because....clowns will continue to buy it.


They buy the game for the... game. The game is fun. The loading times are bad but don't affect gameplay once the game starts (for the most part, but they can in heists I guess). Still, they buy the game for the game content not for the loading times


It's about incentives. Why would a company be incentivized to write better code when people are willing to pay full price and more for what, at times, at release, is garbage?

They won't be. They will push out worse and worse code and it's fine, because it's still being purchased.

Have you heard of a game called Fallout 76? or maybe Red Dead Redemption 2?

Both released as hot garbage and apparently, that was fine.


This is my favorite reverse engineering article! Feel like I've seen it on ycombinator a few times now


Remember reading this way back, glad to see he/she got a 10k reward from R*, great piece of debug work and a bit of inspiration to improve my own skills.


10k reward is a joke. Reminds me of the guy who found you can submit a sitemap for a different domain in Google Webmaster Tools (if the site suffers from open redirect issues). You would then get huge search traffic for a brand new domain. He first got a $1337 reward, which was then "upgraded" to $5000 - for a bug that's worth millions.


10k seems low considering millions of people wasted millions of hours of their time looking at a loading screen.


Like the story about Steve Jobs wanting the Mac to boot faster

"You know, I've been thinking about it. How many people are going to be using the Macintosh? A million? No, more than that. In a few years, I bet five million people will be booting up their Macintoshes at least once a day."

"Well, let's say you can shave 10 seconds off of the boot time. Multiply that by five million users and thats 50 million seconds, every single day. Over a year, that's probably dozens of lifetimes. So if you make it boot ten seconds faster, you've saved a dozen lives. That's really worth it, don't you think?"

https://www.folklore.org/Saving_Lives.html
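Working through the arithmetic in that quote (taking a human lifetime as roughly 75 years, a figure not in the original):

```latex
5\times10^{6}\ \text{users} \times 10\,\mathrm{s} = 5\times10^{7}\,\mathrm{s/day}
\qquad
5\times10^{7}\,\mathrm{s/day} \times 365\,\mathrm{days} \approx 1.8\times10^{10}\,\mathrm{s}
\qquad
\frac{1.8\times10^{10}\,\mathrm{s}}{3.15\times10^{7}\,\mathrm{s/yr}} \approx 580\ \text{years}
```

At ~75 years per lifetime that's closer to eight lifetimes than "dozens", but the order of magnitude of Jobs' point holds.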


Giving one person an extra day is very different from giving 8640 people an extra 10 seconds.

Even if you give them an extra 10 seconds every day, it's not equivalent - you would need to give them an extra 10 seconds several times throughout the day.

Time doesn't really aggregate in people like other things.
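For reference, the 8640 figure above works out to exactly one day:

```latex
8640 \times 10\,\mathrm{s} = 86\,400\,\mathrm{s} = 24\,\mathrm{h} = 1\ \text{day}
```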


I agree with you, but let's play devil's advocate. People waiting already paid for the game, so the business value here is hard to justify. And $10k is a decent amount of $ for the labor involved.


GTA Online made, I think, a few billion dollars in microtransactions. It's one of the biggest money makers in gaming history, and most of it is post-purchase revenue.


GTA Online also profits from existing consumers by selling in-game money.


How do I become that good in debugging stuff like this? I'm guessing he works with gamedev and does this sort of debugging and reverse engineering regularly?


I learned some of this stuff by being motivated to make mods that interact directly with a game's memory.

I mostly build free cameras, which allow you to detach the camera from characters, and for that I had to learn x86 assembly, some disassembler tools, and how compilers generally work.

I think it's super rewarding because while you're seeing bits and bytes, and crashes, most of the day, at some point you can see you achieved something by being able to freely move in a restricted world.

I was surprised how far you can get with Cheat Engine. Its disassembler is really good, and you can do injections directly, if you then want to move to your own advanced stuff, you can build frameworks to do all of that using Windows APIs.


Learn to use a performance profiler, and then a debugger/disassembler/whatever fits your needs. The profiler tells you where the slowdown happens; the rest then lets you figure out what actually happens.

By "learn to use" I mean really do it. It's like refactoring tools: it's sort of boring and tiring to learn them, and you end up skipping large parts because "yeah, I already know / it's obvious" - then one day you need them for an emergency / something right now, and feel like you're fighting against the tool.

For those three (and some others), the key to being very good at it is learning all the small tricks and details when you don't need them, so the day you do, you can skip 90% of the work.

As an overall point - I'm sure many people have different experiences - but for myself: while in high school I needed to parse "replay" files for some games, to extract and process information and let players post their replays to get stats on web forums. That helped me learn a great deal about how to properly reverse engineer and figure stuff out, things that I have no idea how I would have learned otherwise.


Excuse my ignorance... but what refactoring tools do you speak of? I'm accustomed to the basic ones in VS code but I have a feeling I'm missing something greater judging by your comment


There are probably multiple refactoring tools which would fit the description by the parent.

Personally I have experience with JetBrains ReSharper for (full) Visual Studio, it has loads of functionality which makes it very fast for me to navigate and refactor even very large C# projects.

JetBrains has different products for many different languages and I expect they offer similar capabilities in each.

Learning all the features at once is impractical, but if you learn a new one every now and again, after a few months or years you become incredibly proficient.

I had a similar experience with conversation macros back in the day when I was a GM for WoW.

You don’t make too many at once as you’ll never remember them all, instead you make them kind of as you need them, which tends to be situations like “this is the third time today I’m writing this long explanation”.


Find reverse engineering discord servers or IRC channels. READ THE FAQ/WIKI FIRST and try to fit in.

Don't jump in with "hi please teach me" or asking questions in the FAQ/wiki or found easily on the web.

Try to give back to the community (like fixing/improving the FAQ/Wiki) as early as possible.


Rockstar fixed that bug because of this guy... Awesome!


And paid him 10k as bounty.


One wonders how many millions (or tens of millions) they lost from people giving up over a 10 minute loading screen...


Wikipedia mentions revenues for the game being $8.5 billion. It obviously didn't hurt too much!


On the other hand, if you consider the losses as a percentage based on people who gave up on the margin, you can easily derive losses relative to what they would have gotten without the bug in the tens of millions of dollars, and hundreds of millions is on the table.

I can't "prove" that, because the data isn't there, but it is a completely reasonable guess. At an 8 minute load time it is reasonable to guess there were a lot of people who stopped playing and thus stopped spending because they thought their install was crashed or corrupted, and just silently wandered away. I certainly wouldn't. I've got some games that load more slowly than others but I don't think I've waited multiple minutes since the Commodore 64 era.


Chiming in here as someone who used to play in competitive GTA leagues. I quit playing because of too much time spent in load screens. It was worse than just this slow loading bug: random disconnects that triggered loading again got more and more common over time, too.


One could argue the bug is in scanf on Windows?


The problem was that the JSON parsing code was doing exceptionally stupid things in the first place (like calling sscanf), which doesn't matter on small JSON files used in testing but exploded in production once that JSON file grew to several megabytes. Just dropping in a different JSON parser library was probably enough to fix the problem on R*'s side.


Why is it "exceptionally stupid"? sscanf is basically a slighlty more primitive regex engine than e.g. PCRE and I suspect it would work about as fast (if it weren't for that silly strlen() call) — and there are lexers that are basically just a loop with a match() call in it with

    (?P<NUMBER>?\d+(\.\d*)?)|(?P<ASSIGN>:=)|(?P<SEMI>;)|(?P<ID>[A-Za-z_][A-Za-z0-9_]*)|(?P<ARITH>[-+*/])|(?P<NEWLINE>\n)|(?P<WHITESPACE>[ \t\r]+)|(?P<MISMATCH>.)
as the pattern or something like that over the input string, and that is not generally considered to be a stupid way to write a lexer. Why would sscanf be?


Most of the standard C library implementations, including FreeBSD's libc [0] and glibc [1] have sscanf implemented like that, by calling fscanf on a dummy FILE object (with its size populated by strlen() at every call, no caching).

Of course, there are implementations whose authors thought about that and decided to do the reasonable thing instead, e.g. musl [2] and Plauger's old stdlib [3].

[0] https://github.com/lattera/freebsd/blob/master/lib/libc/stdi...

[1] https://github.com/bminor/glibc/blob/dff8da6b3e89b986bb7f6b1... , https://github.com/bminor/glibc/blob/dff8da6b3e89b986bb7f6b1... , https://github.com/bminor/glibc/blob/dff8da6b3e89b986bb7f6b1... — two additional files and five times as much code compared to FreeBSD's implementation for pretty much the same functionality, wow.

[2] https://git.musl-libc.org/cgit/musl/tree/src/stdio/vsscanf.c

[3] https://github.com/wuzhouhui/c_standard_lib/blob/master/STDI...


No, the bug is not in Windows. scanf cannot cache the string length, as it can't guarantee that x and x + offset refer to the same string, nor that the string was unmodified since the last call.

Windows provides _snscanf_s if you want to keep track of the string length yourself instead of having it recompute it each time.


The fix would have nothing to do with caching the string length across multiple calls to sscanf. The fix would be to have sscanf not call strlen on the input string in the first place, and instead only process the input string up to the point where it satisfies the format string or the input string terminates. After all, regular scanf works fine without the length of stdin. As TFA also says:

>To be fair I had no idea most sscanf implementations called strlen so I can’t blame the developer who wrote this. I would assume it just scanned byte by byte and could stop on a NULL.

The author's replacement strlen does the "cache the length across calls" thing only because bolting that on top of the default strlen was easier than doing the lazy parsing thing, since the latter would've required making an actual sscanf implementation to do that from scratch.


The C standard library makes no performance guarantees. Besides, scanf is awful and should be avoided wherever possible.


scanf is a great way to cause a memory over/underrun. Don't think so? Go read the docs for the different CRTs out there. None of them really agree on what all the % specifiers mean.


It's a shame that's their only post (https://nee.lv/)


Agreed. I was rather hoping to see them post something about the GTA V source code leak.


FYI, list of tools used:

- Luke Stackwalker: For stack sampling (perf analysis)

- ???: industry-standard disassembler

- Process Dump: Dump process memory to file (for obfuscated code analysis)

- x64dbg: For debug stepping.

- MinHook: For patching/modding a binary executable.


> ???: industry-standard disassembler

guess that's IDA Pro


Definitely IDA Pro.

It is an interesting case study in piracy: being the go-to software for reverse engineers, it also has the biggest target on its back for software crackers. Despite this, newer versions are not always cracked and released publicly so quickly. It seems each license owner gets their own custom watermarked build of the software. The crack of version 7.7 was leaked from Think-Cell, the Excel add-ons company. 8.3 was recently seen released, with an anonymous supplier and a Vietnamese cracking group behind it.


The funny thing is IDA added an x64 decompiler to the free version right after the post. The piracy could have been avoided if it weren't for the timing.


Other nice threads on it:

I cut GTA Online loading times (2021) (June 9, 2022 — 476 points, 213 comments) https://news.ycombinator.com/item?id=31681515

How I cut GTA Online loading times by 70% (February 28, 2021 — 3883 points, 699 comments) https://news.ycombinator.com/item?id=26296339


Is there a safe way to do this and not get your online accounts banned? I'm interested in tinkering in this way but I'm not a cheater and I don't want to be confused for a cheater and banned.


Loading times for GTA Online are still insanely long on PC


Yeah I remember this blog post the first time it made the rounds

Played recently and thought Rockstar must not have actioned the advice in the blog

But no - they did fix these issues, kudos to them; it's just that it is still slow, and performance still needs attention

People who play this regularly have their phones out/youtube videos on waiting for these load times


You must've never played before this was fixed.


Are you using an HDD maybe?


I mean, GTA Online is a microtransaction/grindfest experience, mostly there to milk kids of their money. It's not really worth the time due to how grindy it is.

Most PC gamers play multiplayer mods like RageMP that allow connecting to servers with well-made scripted game modes like RP, but also stuff like deathmatch, where hackers are actually banned, I'd guess.


I love reading this every time it gets posted. This guy is a genius and rockstar should feel bad.


i read this article when it happened some time ago

this really makes me think:

if this obvious flaw exists ... imagine what other, more complicated and opaque mistakes exist that we're not aware of ...

just food for thought


are loading times any better now? i haven’t played in a while


The article says they are.


oh I meant these days on gta online. I had checked last year or so and it was still pretty terrible with frequent drops etc.


Amazing story!


Good job :)



