In 1983, Atari went from the leader in video game production to complete implosion. Back in 1979, four of their best programmers had left after demanding a share of sales and being refused. Their CEO Ray Kassar famously said, "You're no more important to those projects than the person on the assembly line who put them together. You're a dime a dozen. You're not unique. Anybody can do a cartridge." They left and formed Activision, which led to a sequence of failures by Atari until they finally died, unable to compete with the likes of Intellivision, ColecoVision, and Commodore.
A friend of mine went to work for a small game studio in Oklahoma that'd gotten some acclaim for their Quake mod pack. They took that momentum and started on their own novel IP as a Quake licensee. They made a, ahem, mildly successful game named Medal of Honor.
Some time down the road, the owner of the studio didn't want to share the wealth.
As a result, the top programmers, designers, etc., grouped up and negotiated a deal to become a 2nd-party dev studio with a competing publisher. Nearly the entire company left with them. They couldn't take the IP with them, so they rebranded their new game franchise as Call of Duty.
That studio owner literally made a billion dollar mistake by not simply being fair early on to the team. Never, ever, treat a team that has achieved rare success as replaceable cogs. If they've shipped, they can find more money people any time they want.
>> That studio owner literally made a billion dollar mistake by not simply being fair early on to the team. Never, ever, treat a team that has achieved rare success as replaceable cogs. If they've shipped, they can find more money people any time they want.
What prevented them from offering bad deals that are common today? Some examples I've seen:
- give lots of equity, but vesting over long timelines
- give no refreshers; if people leave, they lose lots of unvested equity
- stay private for a long time...equity is almost unsellable and theoretical only
- give lots of equity, but lag on salary and save big
- give lots of equity, but leave people with huge unfunded tax liabilities if they want to leave company
- give "lots" of equity which is worthless if people actually saw the cap table
I don't feel any of the above are good practices, but they are common practices for equity theatre.
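To make the first two bullets concrete, here's a rough sketch of the arithmetic (purely hypothetical numbers; the 4-year vest with a 1-year cliff is just the common default schedule, not anything from a specific company):

```python
def vested_fraction(months_employed, total_months=48, cliff_months=12):
    """Fraction of a grant vested under a typical 4-year / 1-year-cliff schedule."""
    if months_employed < cliff_months:
        return 0.0  # leaving before the cliff forfeits everything
    return min(months_employed, total_months) / total_months

# Hypothetical grant "worth" $400k on paper; the employee leaves at month 30
# with no refreshers ever granted.
grant_value = 400_000
kept = grant_value * vested_fraction(30)   # 30/48 of the grant has vested
forfeited = grant_value - kept             # unvested value left on the table
print(kept, forfeited)  # 250000.0 150000.0
```

Stack a long private-company lockup and an exercise-window tax bill on top of that, and even the "kept" portion can stay theoretical.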
When I had a job with similar “incentives”, my comment to coworkers was that it wasn’t quite a carrot-on-a-stick, it was the promise of a picture of a carrot-on-a-stick.
> That studio owner literally made a billion dollar mistake by not simply being fair early on to the team. Never, ever, treat a team that has achieved rare success as replaceable cogs. If they've shipped, they can find more money people any time they want.
That wisdom applies to that specific industry. In gaming, the people you hire are the asset. However, the same doesn't apply to all industries. Sometimes people are more replaceable, sometimes less. If you are running a fast food chain and manage to piss off all your employees, yes, it is likely a problem, but if you hire new people and fix your behaviour it is likely that the business will continue to run as usual.
Well, it depends on your definition of "fair". You're going by the market definition -- but the market definition of "fair" is often quite unfair by other definitions.
Considering that markets exist whenever 2 or more individuals gather, without regard to any other factors… it’s pretty much a fundamental force like gravity. In fact even ant colonies experience market forces so it’s practically impossible for it to not exist.
Does it matter if a lot of folks have different definitions of gravity?
> Considering that markets exist whenever 2 or more individuals gather
In the most general sense of "market", this is not wrong. However, you're talking from the point of view of a rather specific market theory. That market theory is not a fundamental force, it's just one of many ways of doing things.
It also has a very narrow definition of "fair", which makes sense within its own world, but I would argue is not generalizable outside of that world.
What's the 'rather specific market theory'? As far as I understand, that is the dictionary definition of a market. Here's Merriam-Webster:
market, noun, often attributive
mar·ket | \ ˈmär-kət \
Definition of market (Entry 1 of 2)
1. a(1): a meeting together of people for the purpose of trade by private purchase and sale and usually not by auction
   (2): the people assembled at such a meeting
   b(1): a public place where a market is held; especially: a place where provisions are sold at wholesale ("a farmers' market")
   (2): a retail establishment usually of a specified kind ("a fish market")
2. archaic: the act or an instance of buying and selling
3. : the rate or price offered for a commodity or security
4. a(1): a geographic area of demand for commodities or services
   (2): a specified category of potential buyers ("the youth market")
   b: the course of commercial activity by which the exchange of commodities is effected : extent of demand ("the market is dull")
   c(1): an opportunity for selling
   (2): the available supply of or potential demand for specified goods or services ("the labor market")
   d: the area of economic activity in which buyers and sellers come together and the forces of supply and demand affect prices
If I had a gun to the head of everyone in town, and everyone in town mysteriously agrees to sell their labor for free, does that mean that cost of labor is fair?
There are circumstances outside of the price which affect the fairness of the price.
What an interesting story. It made me look up how medal of honor started.
Could you clarify a few things? I don't think the story adds up.
Wikipedia has the following information.
Medal of Honor was made by DreamWorks Interactive. [...] Filmmaker Steven Spielberg founded DreamWorks Interactive in 1995. [1]
And:
Danger Close Games (formerly DreamWorks Interactive LLC and EA Los Angeles) was an American video game developer based in Los Angeles. [2]
This doesn't sound like 'a small game studio in Oklahoma'.
The first two PlayStation-only MoH games were not exactly failures, but they were little more than GoldenEye clones with WW2 themes.
The first PC game is really the start of what we think of as the Medal of Honor franchise proper.
The team that bailed founded Infinity Ward, which was the origin of Call of Duty, or at least the first like 8 games in the franchise.
So yes, my story does in fact check out. Which is because I lived it. My friend tried to get me to join the team for 2 years because he knew they were onto something, but I'd fled a childhood in Kansas to build a life on the west coast and wasn't looking to move back to Tulsa of all places. That proved to be a bad career decision but I'm ok with it as a life decision.
Can you have some self awareness of how annoying it is for you to adopt this skeptical fact checker tone when you have so little familiarity with the events and people involved you don't even really understand what to google for and which wikipedia articles to read?
Edit: I'm annoyed because if you tell someone their story doesn't check out, calling me a liar in this case as the story is direct personal experience, you probably need more of a basis for that claim than googling a Wikipedia article about a story you'd never heard of 5 minutes ago.
They asked very politely for you to clarify some details after they tried themselves but were unable to verify it by looking it up. Your hostility is unwarranted and rude. People are not psychic, it was quite reasonable for them to ask.
Moreover, people often forget that the poster probably took some time out of their life to chronicle something. Starting by acknowledging personal experience and their effort documenting it goes a long way towards building a collaborative discussion.
I think there's hostility on both sides here. "Could you clarify a few things?" is one thing, but "I don't think the story adds up" is a direct accusation of lying.
Maybe it's an unfair assumption on my part, but the post starts out exactly in the format of deliberately written /r/IAmVerySmart satires. But it's not. Just because you're using polite vocabulary does not mean bare-toothed sentiment doesn't read through.
The problem is that the anecdote, as originally told, was told poorly. This:
> They took that momentum and started on their own novel IP as a quake licensee. They made a ahem mildly successful game named Medal of Honor
... makes it sound as if the studio started the franchise—which is not helped by the fact that it says "Medal of Honor" (which is apparently a thing), not "Medal of Honor: Allied Assault" (which is apparently the thing they actually meant). Anyone interested who tries to follow up based on these breadcrumbs is going to run into an issue. That is anyone—it doesn't take someone with a predisposition to being an asshole to end up here; even someone who read the initial comment and thought, "wow, that's cool; I'm interested in learning more", and then proceeded to try to learn more would have gotten tripped up this way. (It's only by starting at the opposite end—with Call of Duty—and working backwards to its origin story that you're going to be able to resolve this.)
To make out as if someone is being automatically uncharitable and then airing emotion-driven grievances about it is, perversely, the most uncharitable thing (and, for the reasons just mentioned, perhaps only uncharitable thing) to have actually happened here.
It's easier to start from the end by googling "Call of Duty founders" than starting from Medal of Honor and hoping there aren't too many branches with only one leading to Call of Duty (which is the case here).
"Call of Duty founders" -> first link: https://en.wikipedia.org/wiki/Infinity_Ward -> 3rd sentence: All of the 22 original team members of Infinity Ward came from the team that had worked on Medal of Honor: Allied Assault
The tragedy of the written word between people that don't know each other. One could say "Could you clarify a few things?" in anger or "I don't think the story adds up" smiling and with a friendly tone. In writing the intention of the writer is in the mind of the reader.
>One could say ... "I don't think the story adds up" smiling and with a friendly tone.
While you can call bullshit in a friendly way, it's almost certainly better to assume that you are wrong if you think the other party is equally or more credible than you.
I could be wrong, but it was not so easy to write to strangers until very recently. Except for laws, books (with plenty of space to make the context clear) and journalism (same thing), I think that most written communication was letters to friends and relatives. Again, a lot of shared context. It's the internet (forums, comments on sites and social media) that lets us write to strangers maybe more often than we write to people we know.
> They asked very politely for you to clarify some details after they tried themselves but were unable to verify it by looking it up.
Not really. GP straight up accused the OP of lying, under a thick but transparent layer of euphemisms. Coming at someone out of nowhere with accusations that "your story doesn't add up", especially after failing to do their homework, is the opposite of politeness.
This isn't wikipedia; claims on these forums do not need to be substantiated with fact. I agree with the sibling comments that there is too much fact-checking here and it is rude AF
Storytelling does not require that one cites their sources. Especially when that source is, "I lived it"
You may have found it annoying, but I think it's good to have a healthy level of skepticism on the internet about stories whose source is 'a friend of mine'.
I agree OP could have been a bit less confrontational, but...
> The first PC game is really the start of what we think of as the Medal of Honor franchise proper.
> They took that momentum and started on their own novel IP [...] They couldn't take the IP with them
That may be your opinion; it certainly isn't mine, having played both the PlayStation games and none of the PC games. I very clearly remember the splash screen for DreamWorks on starting up the first Medal of Honor game. I'm still not sure how the third game in a franchise could be considered 'novel IP', especially as it seems they were approached by DreamWorks[1], so it's not surprising they couldn't take it anywhere else.
Without the explanation above, I would have dismissed your comment as nonsense out of hand without bothering to engage.
However, OP questioned, you clarified, and I learned something. Choosing your own definition of when the franchise started made it very difficult to accept your comment as it stood initially though.
Perhaps you could also have some self-awareness of how often people post about 'my friend who told me this anecdote about this big thing' with red flags in their story, and how much it's important not to believe everything you read?
The PC games done by 2015 (whose team went on to found Infinity Ward) were basically them getting to do whatever they wanted, vs being handed a prepared design bible as was common for 2nd-party dev in that era. That team absolutely deserves credit for what people commonly call MoH. Great, you loved the PSX games. You're rather alone in considering them the same thing in all but brand.
I don't think they were annoyed by the follow-up; they were annoyed by being softly accused of an inaccurate story and not being given the benefit of the doubt. Removing the "I don't think the story adds up." and changing the period of "This doesn't sound like 'a small game studio in Oklahoma'." to a question mark would have helped make it less accusatory.
In my opinion the original story was a bit misleading though, reading as if the small game company's novel IP was the original Medal of Honor, so I do think the follow-up was warranted:
"They took that momentum and started on their own novel IP as a quake licensee. They made a ahem mildly successful game named Medal of Honor."
I also think the response to the questioning was a lot more hostile than it needed to be but ultimately that there was rudeness on both sides
HN is the only place on the web that I know of where someone will tell you--someone who has experience in X, Y or Z--that you are wrong, or "sealion" you, or better yet, attempt to correct you with their armchair experience.
Then, you'll be reprimanded for pointing it out, asking you to "not do that here."
I wish moderation would curtail this obnoxious behavior, because I see HN as a place where experts can detail their experience, and over the years I see more and more amateur butting-in or sealioning behavior taking place, and people I know have left over it.
Intense levels of plausibly-deniable passive aggression, and absolutely god-awful intent being treated as something better to the detriment of discourse thanks to the "assume positive intent" rule, are what I consider some of the core defining features of the HN experience.
I dunno if it does any good, but these days when I smell ill intent beneath the surface of a "just asking questions" post I just flag the bastards rather than trying to help them by answering. Responding is simply feeding trolls, and HN threads are full of that sort of thing. Argumentative jerks who are just trying to argue, while staying just civil enough that they don't get slapped down (not too quickly, anyway).
The problem is that your figurative sense of smell may not be correct. It really is quite easy to mistake an honest question or assertion as passive aggression or sarcasm in text form.
Maybe, but I can assure you from back when I tried to engage in good faith it's pretty damn accurate. It doesn't hurt that HN's a target-rich environment for such a detector, of course.
I had the feeling my honest questions were not rarely seen as sarcasm or bad-faith arguments. Probably has some overlap with autism spectrum disorders.
And I often wished I could specify “please read this comment in literally the way I wrote it and don’t try to find a non-existing message between the lines”. Alas.
> And I often wished I could specify “please read this comment in literally the way I wrote it and don’t try to find a non-existing message between the lines”. Alas.
This is definitely a problem, and while typical on the Web more broadly, there are a few flavors of it that are more common here than other places. You have my sympathy, and I do try to be aware of my own limitations and don't just run around flag-happy for anything I might be able to read as having a bad tone. Though I'm sure I mess up sometimes.
However—there's a certain kind of needling, brash posting style that's nearly always just someone trying to tee themselves up for some usually-idiotic rant or series of needlessly-aggressive arguments they have prepared but don't yet have an entry to post without being off-topic, and that's often initiated with a question that looks kinda innocent but is just a little off. That's the "smell" I mean, and it's the kind I've learned to just flag without trying to "help" (due to assuming good faith), and I'm pretty sure I have a low false-positive rate on those. This is, from what I can tell, an extremely successful approach to trolling HN (I'm quite certain a few posters do it for that express purpose, though I do think the typical motive is different and not purely aimed at creating chaos and bad feelings), and one that HN has no good defense against except cleaning it up after the fact, which is often after the whole discussion thread's mostly dead, anyway.
It's not unusual for half the posts in a thread to stem from these kinds of premeditated argument-prompts that were never intended to curiously explore the problem space (though they may, for a time, masquerade as such) and to end up acrimonious and/or in massive wheel-spinning flame wars—as there's no other way it could have gone, because the instigator was looking for a fight, even if they weren't trying to troll per se, and were relying on assume-good-faith engagement to let the embers mature into a full-blown fire so they could embark on their righteous crusade or whatever it is they think they're doing, rather than being ignored (again: for god's sake, give us user ignore-lists—that and making poster identity a little more prominent so we can more easily recognize patterns, rather than instances, of behavior would help so much and I bet those flame sub-threads would get a lot quieter) or instantly called out and told to fuck off as they might on other sites that lack strong adherence to the "assume good faith" rule.
To avoid just shitting on the site (I'm not here because I hate it... though I do think some parodies and unkind criticisms of HN are closer to the truth than its own collective self-image is), if I could pick out one cultural thing I really like about HN, it's the relative lack of value-free posts about obvious typos or accidental word omissions or that sort of thing. You see occasional corrections of actual usage or spelling mistakes, where the poster seems not to have slipped up but to actually have a poor idea of what's most correct and to have done the wrong thing deliberately, not realizing it was wrong, but those are usually polite and at least convey potentially-useful information. But, very little "did you mean X LOL?" where every single person reading it can tell that yes, they meant X, and simply made a mistake. That shit's really common on some other corners of the Web and it's just the worst. I think that quality's mainly a consequence of HN being pretty decent at policing blatantly low-value posts in general.
> Lately HN feels more like a bunch of lawyers quibbling over semantics...
There is no semantics challenge in a random coming at someone out of the blue with accusations of being a liar. The only lawyering involved is determining if it would represent libel or slander.
>Yet 2015 would never get the chance to make another Medal of Honor. EA decided to take all development for the franchise in-house. Morale was low amongst the team and they were looking to start up on its own.
>We had bonded as a team, but decided we wanted to work with new management. Many members of the team were actually going to leave to find new jobs, regardless of potential royalties coming in from Medal of Honor.
>After leaving 2015 we were working with a major publisher. For legal reasons I will say things didn’t go as planned with it. We were left in a situation of unpaid milestones that were delivered and no finances to operate on,” says Thomas.
>The company was potentially going to disband. In a last ditch effort our then president, Grant Collier sent out a signal to all the major publishers in the industry letting them know that the majority of the Medal of Honor: Allied Assault team was available. Within days of closing the doors on the studio, Activision responded immediately with an offer.”
Yeah, I'm giving a simplified version, and also avoiding disclosing some details that might blow back on my friend. It was considerably more nasty than that article portrays via the quotes.
Black Isle is another great example. Interplay was going under and selling off any IP of value to get some cash out of the end of the road. The staff, seeing the writing on the wall, quit more or less en masse and started Obsidian together. The name is even a bit of a pun- Obsidian, a black volcanic glass, is what you might expect a Black Isle to be made from.
It's very similar with tech startups. Once you've been around the washing machine loop a couple times you realize just how much of this stuff is arbitrary and luck dependent. Unicorns born upon butterfly's wings. Having a certain background lets you buy more chances at the luck part. It's not fair. It is.
Personally my read on this is we should have some humility about how unpredictable this all is.
An entire company 'quitting' and just 're-doing the thing' is almost assuredly theft of know-how and IP, but more than that, theft of the operating modality.
It takes an incredible amount of work, risk, investment etc. to 'get something up and going' - with all of the parts working.
Any time you walk into a company you'll see what looks like 'things working' on some level, usually that took incredible trials and travails.
It's a bit like 'decent code' - it takes iterations, after which, it's 'obvious in hindsight'.
Every coder knows it's 'figuring it out' that's hard, whereas doing it a second time is easy.
Employees who took 100% salary to start, without higher-risk equity, and then wanted to 'trade after the fact' shouldn't be miffed - they just shouldn't have taken the job if what they wanted was equity.
It could entirely be the case of cockroach management giving horrible terms to everyone including underpayment etc. but these stories are often one-sided.
I'm working with a company right now that I've discovered has a seemingly 'simple' product. It took this young girl 4 years of struggle (and failure before that) to get this thing where it is and establish all the sales relationships. I'm sure I could duplicate it quickly with minimal resources (I wouldn't do that to her), but it has dawned on me how much effort it takes to move things forward.
Here is the story, and it doesn't really speak to some kind of greedy action by 2015, the original game devs. More subtle than that. More like the original team, which was assembled by EA, liked working together, and were lured away by another studio as a team.
> It takes in incredible amount of work, risk, investment etc. to 'get something up and going' - with all of the parts working.
So... it's fair if the people who did all the hard work ask for some form of participation? If all it takes to duplicate a product is money and the people with know-how, then the capital is a rather marginal contribution?
And I am deliberately talking about know-how, and not IP here. You cannot apply copyright to the knowledge of your employees.
I mean, you can call it IP theft, I can call it wage theft. The article quotes one of the devs as saying they had "unpaid milestones", which reads an awful lot like the "major publisher" he didn't want to name for legal reasons had violated the one term that mattered: the part where they pay for the game.
The lesson to take away here, for management, is that you can't get away with everything forever. Whether you view the actions of 2015 as IP theft or just deserts, the fact remains that it wouldn't have happened if the team had A) gotten paid and B) gotten the terms they asked for. I'd be demanding a better deal, too, if my publisher mysteriously forgot to pay for a milestone.
The lesson for 'management' is to get better contracts and not to invest in and develop people who will walk out with your stuff.
Item 'A' is a bit more reasonable, people not getting paid is bad.
But item 'B' is not. Sorry, you don't just get to ask for a better deal after the fact, because it finally worked out and now in 20/20 hindsight you want a cut.
But why not? Why shouldn't there be a process for renegotiating a contract? Especially in cases like this, where the employees are still producing things for the management--I would understand if you were renegotiating JUST on an existing product, because renegotiating on a deal that's already over makes the deal drag on unnecessarily, but these people were probably working on a new game while talking about renegotiating their contract. They weren't just talking about their compensation for the game they'd already finished; they were talking about compensation for every game they'd make in the future with that publisher.
> Their CEO Ray Kassar famously said, "You’re no more important to those projects than the person on the assembly line who put them together. You’re a dime a dozen. You’re not unique. Anybody can do a cartridge.".
Imagine any software company CEO nowadays saying that out loud, no matter what they privately thought.
> Imagine any software company CEO nowadays saying that out loud, no matter what they privately thought.
A daughter of my friend was not very happy in her job: a Silicon Valley company hired her as a security pro, but was using her as a coder, which she hates. She was going to leave, but decided to wait ten months or so until her stock options vested. She and a big group of other engineers were fired right before the vesting moment.
All this time the CEO was generating absolutely politically correct sounds: people are our best capital, diversity is our strength, etc. She would be better off if he was honest.
> She was going to leave, but decided to wait ten months or so until her stock options vested. She and a big group of other engineers were fired right before the vesting moment.
I'm not an expert in the Silicon Valley ethos, but to me it sounds like both your friend's daughter and the company were playing the same game: trying to extract the most value from the other party without actually having a long-term commitment. I suppose she was not going around saying how much she hated the company and that she would have left as soon as it was convenient.
The company had the upper hand, but can she really complain?
You are entitled to hate your job and still do what you're being asked. For lack of knowledge about more facts, we should assume the friend's daughter was doing the job.
> The company had the upper hand, but can she really complain?
She is a bright girl, so she is not actually complaining; she knew the risks and trade-offs. It was her first job after college, btw. It's me who is a bit bothered by her story. You see:
1. She was hired to improve diversity targets (her estimate).
2. After she was hired, her brains were ignored - a rather painful situation for a person with brains.
Would this bright girl have been better off if we as a society put less pressure on companies to hire girls?
> All this time the CEO was generating absolutely politically correct sounds: people are our best capital, diversity is our strength, etc. She would be better off if he was honest.
Honesty rarely pays and is also thankless. There is only a little benefit and lots of downsides, such as people getting seriously pissed at you. It is no wonder that corporate leadership roles are filled up with people who see no problem with talking bullshit all day.
They meant honesty from the CEO would be good for her, not the CEO. Because she could have found another job instead of waiting ten months to be fired.
> Imagine any software company CEO nowadays saying that out loud, no matter what they privately thought.
Famous quote from the Comarch CEO: "any developer could be replaced with finite number of interns"
This is a Polish software company (quite big, one of the biggest) and since this quote went public, they don't have the best reputation among developers. You go to work there only if you are actually an intern fresh out of uni.
I don't think there are any online sources. I don't think I've met a polish coder that wanted to work for them. Some companies won't even hire a candidate that spent more than a year at Comarch (they would argue - if a candidate could withstand that company for that long there must be something wrong with them).
My mom, who's a programmer, once worked for Computer Sciences Corporation, which she jokingly referred to as "the whorehouse of the computer industry".
But she was just being charitable, because they were into so much more than just that!
>The company has been accused of breaching human rights by arranging several illegal rendition flights for the CIA between 2003 and 2006, which also has led to criticism of shareholders of the company, including the governments of Norway and Britain.
>The company has engaged in a number of activities that have resulted in legal action against it. These are:
>Its so called WorldBridge Service (Visa Services), which processed and issued millions of visa applications to enter Britain, did not involve British authorities.
>CSC was one of the contractors hired by the Internal Revenue Service to modernize its tax-filing system. They told the IRS it would meet a January 2006 deadline, but failed to do so, leaving the IRS with no system capable of detecting fraud. Its failure to meet the delivery deadline for developing an automated refund fraud detection system cost the IRS between US$200 million and US$300 million.
>- if a candidate could withstand that company for that long there must be something wrong with them
Or just that being exposed to an architecturally dysfunctional organization breeds negative behaviour and mentality.
Privately I consider this kind of behaviour espoused by executive management to clearly qualify as harm to society surpassing the criminal threshold. This needs concerted study, however, to form the basis of anything more than a grizzled opinion.
I feel like software developers have either the biggest or the smallest egos; the majority is in the latter category and will keep their heads down. But in companies, all the managerial staff - including, these days, scrum masters, which is now a dedicated role held by a non-developer - will strut around like they own the place and as if everything would fall apart if it wasn't for them.
I agree 100% with much of the sentiment here that creators deserve respect and a cut of the spoils
At the same time, it really depends... I'm pretty confident that there are very few engineers at FAANG that can't be replaced. I'd also expect there are very few engineers at Epic, Activision, Blizzard, Naughty Dog, Sony, Rockstar, Ubisoft, Valve, etc. that can't be replaced. Sure, if 30%-70% of the team left on any particular project it would probably die, but at least for AAA titles, there's usually no one person responsible for that title's success. Or maybe there is, but it's limited to a few key people and not every person on that team.
If you're at some indie firm with 5-15 people, that's probably less true.
I mostly made this comment because in 1983 most games were made by 1 to 3 people max. By the end of the 80s there weren't many games that had more than 20 people on them, and usually they took less than a year to make. It was only in the mid 90s that we started getting 30+ person teams trying to fill a CD with data, and it arguably wasn't until the 2000s that it became common for games to take teams of 30-100+ people multiple years to make.
If you take the truly non-performing folks, well sad to say you can probably get rid of most of them and improve performance.
Otherwise, get rid of any engineer and the minimum impact is 3-6 months of code and culture familiarisation before they get up to speed with your particular application/code base/equipment. It can easily be more than a year, especially with some of the big systems.
So yes there is an impact on business performance, and a highly damaging one, far more often than is realised. Companies compete - and companies go under and get replaced, all the time.
>get rid of any engineer and the minimum impact is 3-6 months code and culture familiarisation
On a big project, say 100 devs over 3 years, 6 man months is 1/600th of the work, so a single person is replaceable and it's not even noise. If the replacement takes 6 months to get up to speed the replacement is certainly not a very good developer, even on the biggest projects. At that size, there's lots of small side projects, testing groups, and the like, so there should be plenty of smaller pieces to work on, and some on those small projects are always happy to jump into the main work, not needing 3-6 months to be useful to it.
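The "1/600th" figure above checks out as back-of-the-envelope arithmetic; a quick sketch using the comment's assumed numbers (100 devs, 3 years, 6 man-months of ramp-up for one replacement):

```python
# Assumed inputs, taken from the comment above.
devs = 100
years = 3
ramp_up_man_months = 6

# Total effort on the project, in man-months.
total_man_months = devs * years * 12  # 3600

# One replacement's ramp-up as a fraction of the whole project.
print(f"ramp-up is {ramp_up_man_months}/{total_man_months} "
      f"= 1/{total_man_months // ramp_up_man_months} of the total work")
```

With these numbers the ramp-up cost is 1/600 of the project; even doubling the ramp-up estimate only brings it to 1/300, which supports the "not even noise" claim.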
This is not highly damaging on any but the smallest, shortest projects. And even there people move around all the time and don't destroy projects.
Often the person leaving has not been that good of a contributor due to wanting to leave, while a replacement is new and likely more inclined to work hard.
On a big team, people fit a bell curve, and most likely those leaving are not going to cause much harm (otherwise no big project would get done - all have people coming and going over the lifespan of the codebase).
> I'm pretty confident that there are very few engineers at FAANG that can't be replaced. I'd also expect there are very few engineers at Epic, Activision, Blizzard, Naughty Dog, Sony, Rockstar, Ubisoft, Valve, etc... that can't be replaced. Sure, if 30%-70% of the team left on any particular project it would probably die
The thing is these two things are linked - one engineer leaving and 30-70% of a team leaving. The quantity of who leaves does not matter as much - a project may be able to handle 70% of consultants, interns, junior and regular programmers leaving, but might die if the 30% (or 25%, or 20%) leaving is entirely senior/staff/principal.
One of the lead programmers who has been at the company for many years leaves. He is friendly with some of the other senior programmers and says he thinks the company is slowly going downhill, and he got a new job with better money, and with a saner schedule, work environment and work-life balance. Maybe one of the other senior people leaves for the company he left for. Then other senior people start leaving.
It's like Steve Blank's essay about how a company deciding to start charging fifty cents for soda led to an exodus of its best senior programmers. One lead leaving can be a catalyst for others leaving. So they are in a sense irreplaceable.
If a company is an oligopoly like Verizon/AT&T or the like, then they are privy to revenue and profits they don't have to compete for, and for companies in that situation people are more replaceable. Not for companies that have to be competitive though.
Most developers working on products, not just at FAANG, can be replaced but it's very costly.
You need to find a developer in the specific niche in which they were competent. That's not easy, because there is a shortage of developers. Then they need to get up to speed with the stack and the processes used in the company.
So in theory yes, engineers can be replaced. In practice it's costly, with no guarantee of getting the same productivity, and the process to find someone will leave you with one person less for many months. When you have competent engineers that you want to keep, the last thing to do is to play the "I don't need you anyway" card.
Blizzard lost their entire RTS staff to Frost Giant and will likely never release another RTS again. Sure they can probably survive on lootboxes for Hearthstone and Overwatch, but it's not the company it was. Even WoW is basically dead and is just rehashing with classic. Diablo got turned into mobile lootbox garbage as well, and it's too early to tell if Diablo 4 is going to follow the same path, but I wouldn't hold out hope on it being a critical success.
Large game companies have now essentially become casinos.
Video game studios taking advantage of young, naive people who will burn themselves out working unpaid overtime for the privilege of working for a game studio? I'm absolutely shocked, shocked I tell you!
Some of them probably get 50 applicants for every position they advertise. It's a meat grinder.
Video game developers (and the working class in general) desperately need to unionize; young folks are too naive to realize they're working towards chronic health conditions.
My sister is a vet. She works in a normal vet practice in our city dealing with mainly cats and dogs. Once I asked her why she doesn't work as a vet in the zoo, an extremely prestigious and wealthy institution with zoological research, wide variety of animals, etc. She just said that all young vets want to be a vet in the zoo. So they have much worse pay and conditions than other vet jobs.
Video game developers don't have terrible conditions and relatively low pay because of some anomalous lack of bargaining power which can be fixed by unionization. They have lots of bargaining power, most of which they use to choose the industry they work in. There are lots of young men who want to work in games, and far fewer who want to work in financial software. So pay and conditions are far worse in games, to the point where supply meets demand in each type of development work.
In a sense, some do - by quitting and/or going indie. There are some good studios out there; Team Cherry (Hollow Knight) famously doesn't do crunch. They also don't (need to) make any announcements about games until they're ready.
I'd argue that's part of what makes it work. You can get ten people into one meeting room (or one zoom call) and still be able to talk to each other clearly.
Many game devs are fungible cogs, implementing a well-defined blueprint. Especially Activision games like Call of Duty. It doesn’t help that there’s just so many devs these days that they can seemingly abuse them for decades and nothing has collapsed.
Devs are definitely not fungible; the difference in productivity, team morale, and new bugs introduced by changing just one person on the team can be huge.
Even simple boilerplate is done differently by different people. Some will automate it, some will do it manually forever. Some will naturally organize to discuss how to limit or improve it, some will stay with the status quo ad vitam. Some will document how to do things, provide templates to limit mistakes, and mentor newcomers. Some will just do their job.
And that's not even touching the fact some are simply bad at what they do.
In all the successful projects I've seen, hiring the right people or replacing the one leaving were critical processes, not just swapping.
This idea you foster is probably half the reason 2/3 of IT projects fail.
Big games will have a small core team - engine programmers, creative leads, etc - and a large section of 'grunt work' - modeling, texturing, animation, etc. The kind of thing someone puts on a very long todo list to be picked up. That's likely more work where individual contributions become less important.
It's also an area where there's more and more outsourcing happening these days.
Try not to take it personally, there are both devs who are replaceable and devs who aren’t, and with ~40 Call of Duty titles on almost as many platforms, a million and one people have worked on it, some doing more mechanical port work than others. There’s truth in there; the games industry is tough, and it’s relevant that some studios that (for example) demand lots of overtime haven’t seen any large exodus, or sometimes there is high turnover and the studio still survives, to parent’s point. There’s a higher level layer to this, that from a publisher’s point of view, there are a lot of smaller studios that are easily replaceable, and I’d speculate studios go out of business over contracts lost to other studios far more often than over employee walkouts (which of course fuels the need for overtime to be competitive). This is true for games and for VFX production in the US, enough people want these jobs that high turnover doesn’t seem to slow the business.
Well for what it's worth my friend is still with them, or at least what you could call the main descendant of that team. They don't treat him as a cog. In fact he was their first full time remote employee as I understand it, as he got sick of living in Tulsa. No offense Tulsans, but when you've lived in the PNW for a while it's kinda hard to give up all the trees, mountains, etc. I do miss thunderstorms though.
That is a great story, and should be an inspiration to aspiring game devs. As you can clearly see in the thread, I did not bring up CoD. Two comments above me were discussing it. I was just pointing out that it’s now a huge franchise. All the franchises, large and small, have cycled through many, many programmers and artists and designers. It does not disrespect your friend to point out that there are multiple studios he didn’t start that are now developing CoD, or to point out that it has been ported to so many platforms that there has been a metric ton of unsexy porting work alongside the original content work. Having worked on both game and movie franchises, I can safely say that there’s less room for individual input. Not none, just less. I’ve witnessed whole studios (both in games and films) push and push to work on an original non-franchise production, because everyone knew it’d be more fun and felt less like being a cog. The fact that your friend made a wildly successful franchise is absolutely great for him, and for his business, but you can’t claim that it’s creatively great for everyone else involved, even if it does support them financially.
> Stop perpetuating shit you read on some gamer forum
Whoa brother, maybe cool your jets and don’t make hasty assumptions. I was a lead game dev for a decade. I didn’t work on Call of Duty, but I did work on some large titles, and I did put in two decades of work hours in one decade. The studio I worked at was always perpetually on the edge of closing, and that was used to push people to work harder.
I also worked another decade in VFX too, and saw the same things. In the mean time, both studios actually did suffer closures for the reasons I cited, and they were both replaced by other studios. If you have experience to share and not just logic, I’d love to hear more about it. Otherwise, I’d encourage being careful about making assumptions. The business world is already tricky and exploitative, tearing people down who point it out doesn’t solve much.
Hey thanks! I actually never even thought of writing it up outside of here, I’ll have to let that thought steep for a while. I give talks to college game dev and CS classes every now and then, and there are quite a few tidbits over the years I’ve added in comments here on HN. Maybe someday I could smoosh it together into something coherent. I’m just not sure it wouldn’t be crushingly boring. :P
you mean like Call of Duty: Modern Warfare Remastered? or Call of Duty: Modern Warfare from 2019 not to be confused with Call of Duty: Modern Warfare from 2007?
The reason I said this is the phrase "implementing a well-defined blueprint"; it just compelled me to say something. I am not an expert developer by any means, but it is, to a degree, like saying paintings are all the same because most of them use the same colors. Arguably writing prototype code is simple, but fixing bugs and tuning everything so the game is fun and fluid is where the hard part is.
Games have seen incredible growth as an industry over the past couple decades. It works because everyone and their grandma wants to play CoD, not because what they're doing is sustainable. Let's see how they fare once the markets saturate
Clearly the situation is very different. Activision is much, much bigger than Atari was. Four of its top developers leaving wouldn't cause the company to implode.
The more I learn about game dev the less I think this is true. The number of programmers that can work on bleeding edge game engine tech is incredibly low and the learning curve has only gotten more severe over time.
I mean, I'm sure they wouldn't implode, but I bet they pay their core engine devs a more than decent wage.
A top developer leaving also functions as a signal to others in the company.
I once left a company quickly after a senior leader had left. That proved to be a good move since the company was going under and sold a few months after.
It’s justified though. Games are basically a solved problem these days, and the developers of today are most of the time just building on abstractions and best practices that didn’t exist during Atari’s time.
Once a particular domain of software becomes sufficiently mature, there is no real opportunity for heroic programmers to emerge who become too valuable to replace. Eventually more people emerge who are just as good.
I... have you played games at all recently? Have you seen the recent major releases? BF2042, Cyberpunk, etc. Even the highest quality game studios (not the aforementioned) have trouble making good high quality releases, especially with consistency.
people in the tech industry like to overestimate their own skills.
we see the same attitude in the software industry - the notion, around since the 60s, that software is a solved problem. Yet everyone has difficulties shipping software that actually works, whether that's titans like Apple / Microsoft or small mom-and-pop shops.
With games the difficulty is two-fold: 1. games are an art, and making art to good taste is a complex problem; 2. games are software, and thereby suffer from the problems encountered by the regular software industry, e.g. lack of labour / resourcing.
re: Cyberpunk, they tried to solve it again - over-estimating their own abilities to build a game engine (like they did with the Witcher games before) and ending up getting the basics wrong (e.g. resource loading on lower-end systems like the PS4).
Games SHOULD be a solved problem. There is no good reason for us to have to reinvent the wheel over and over. There are compsci white papers that neatly solve all of the big problems games run into.
But games are not a solved problem. There are multiple overlapping reasons why.
One is that gamedevs often just don't do the research. Why would they? The deadlines loom, the milestones have to be delivered, nobody cares if it's a hacky mess right now, surely management will give us time to fix it once they realize that it's broken--but if we don't deliver anything, the publisher cuts our funding.
This overlaps with scheduling and management issues. It turns out that writing good software takes up time[1], and the problem with games in particular is that they don't make money until they're released.
You don't write games like you write business software, where the other company paying for your milestones is the company that's going to use the software; that company usually has a revenue stream even without your software, so they don't have to care as much. For a game, though, there IS no revenue stream until the game launches. Every year that a game is in development without a release is a loss, and that pisses the board of directors right the fuck off, so that means the game needs to be out ASAP.
Because of this, games are often not given enough room in the development schedule to be made correctly. There's no time for research, testing, planning, or any of the other important parts of software engineering--we have to write this code NOW, or it doesn't ship. And if you read that source I linked in the footnote, you'll know that this produces rotten software.
This is compounded by the kind of one-upsmanship that is created by such an environment--leading to a phrase I've heard from friends in other companies: "Very optimistic people who are no longer with the company made this decision". You get into a situation where people made promises to impress the publisher, claiming that they can turn out a game in an impossible timeframe, and that got them fired--but now you're stuck cleaning up the mess, and the publisher has already wasted a lot of money on the years spent thinking it wasn't a mess.
Mix in the siloing of information (because all of this shit is proprietary) and, despite all of the problems being solved on paper, nobody's solved video games.
There's plenty of unsolved problems in gaming and that's usually where indies make their money. Games like Dwarf Fortress, Minecraft, Stardew Valley, Rimworld.
I'm genuinely curious, what money did Dwarf Fortress make? The game is very cool, but I think it's a bit too unapproachable for the majority for it to make any amount of serious money like Minecraft, Stardew Valley and RimWorld did.
I could be out of the loop, but the last time I played Dwarf Fortress it still used terminal graphics, and while I believe in gameplay > graphics, my brother and the majority don't, and probably won't even touch the game. (Not to mention the _menus_.)
Okay, maybe DF is a bad example. It seems to be making roughly $15k/month in donations for two people now. It is coming to Steam with a major graphics and UX overhaul, so I guess we'll see eventually.
I very much doubt that. There's a reason why innovation in games often happen, entertainment is not an easy problem to solve, with no set quantity to achieve
I agree as well that technology wise it still isn't solved, but I think it is the creative side that will be the most unknown part of the project these days.
That is why we have so many bland but technically impressive games. Studios want safe bets, an FPS game is easy, making it interesting to play is still very hard.
Agreed, but a lot of industry work doesn't give you much leeway to be creative. You can say the same for a level designer who just has to implement pre-specified designs they had no hand in.
A decent number of BF games flopped. Assassin's Creed had three "reboots" of the formula. Ubisoft recently realized they have to shake things up, since flops are more and more common.
it depends on what the poster meant by "games" - do they mean engine and graphics? Or do they mean game design/mechanics?
Game engines and graphics is "solved" if you stick to popular concepts (which are those that are easily available in commercial engines).
Game design/mechanics is an unsolved problem imho - unless you consider it solved when merely taking an existing game design (like an FPS) and add nothing new to it (aka, those yearly COD military shooters).
Consider a game like https://store.steampowered.com/app/1141580/Taiji/ (inspired by the witness). This game is quite unique, and cannot really be recycled in to another game (without it being just a clone).
'Solved'? - Take a look at some of the Lumen/Nanite tech in Unreal Engine 5. And that's just a small chunk of what's happening on the rendering side of things. Game tech continues to evolve at a fair pace.
atari very much does not exist, the trademark/name has passed through like 3 or 4 different hands now. whatever corporate entity is now calling itself atari has absolutely no relationship to the original.
Right, only the Atari name exists today. Famous defunct company names regularly get bought by entities wanting to cash out whatever goodwill or positive associations are still left in them. Would-be buyers with less extractive ambitions can't compete in the auctions.
So no, Atari doesn't exist today, what exists is a company wearing Atari's skin as a suit, in order to fool you. Don't be fooled by names without the organisational continuity to back it up.
Better example is AKG which used to be well respected for making high-quality headphones, until Samsung bought their parent company and started just using their logo from 2017 on cheap crap.
The original engineers formed a different company Austrian Audio (no article on wiki apparently)
This was the beginning of an avalanche that led most of the remaining engineers at Atari to leave and try to start their own third-party game companies, following in the success of Activision.
Atari collapsed under its own weight because of some pretty profound mismanagement, which led to massive layoffs. (you mentioned in another comment that engineers “left in an avalanche.” The word is “layoff”) There were plenty of extremely capable designers and engineers (and researchers) who were let go in a very short period of time, people who had not gone off to do their own thing. Atari was big.
If anything, other companies making good games for their platform aided them, and they were famously bad at seeing that. Engineers leaving to develop Atari software didn’t move the needle in terms of their collapse. Not giving individual credit on games is a tiny footnote in the book of things Atari management did wrong.
Making money isn't about making the best product. Commodore proved that. It is about getting people to give you money... So yeah, maybe they are worth it.
It really depends on the company. It sounds like you were working at a B2B/enterprise company? But yeah, getting new business and/or funding is often a big part of the job. It doesn't meant they don't also make decisions - but I guess the bigger the company the fewer different roles the CEO is filling.
Schmooze is a bit of a demeaning word, isn't it? If the CEO is learning from customers and helping the organization react accordingly, that sounds very valuable.
But if schmoozing means their company gets a new big customer or their stock price goes up, they have provided value I guess. It's just not from what we consider "real" work.
That kind of demonstrates the importance and value of a good CEO though. A couple dumb decisions by an assembly worker won't kill a company. A few bad decisions by a CEO can.
> That kind of demonstrates the importance and value of a good CEO though. A couple dumb decisions by an assembly worker won't kill a company. A few bad decisions by a CEO can.
A few dumb (or malicious) decisions by an assembly worker can cause enormous damage and incur enormous cost.
I think most job salaries can be linked directly to how much of a difference the employee can make. A complete dud at McDonalds in the kitchen might cost a few thousand on average but the further you go up, the greater the damage or benefit potential gets.
David Graeber, of Bullshit Jobs fame, makes the distinction between service work and caring work. Doing things vs taking care of other humans.
Part of his thesis is that society undervalues the labor of caring workers, because they get so much "job satisfaction". Teachers and nurses are examples of caring work.
I lean towards Graeber's thesis, mostly because I haven't read any other explanation for this pathology, so Graeber wins by default.
> parents paying for private schools seem to be willing to pay that for their children
Exactly—people aren't willing to pay for someone else's education, but everyone needs an education, so teachers who care enough that they are willing to teach for minuscule pay wind up squeezed between societal apathy and societal need.
Except the bad CEO still gets a big salary, and a golden parachute when he leaves the sinking ship of the company he killed. And then proceeds to get hired as CEO of the next company.
The effects of a CEO's actions often lag. If an assembly line worker stops working, you see the results immediately. If a CEO stops working or does something very negative, you might not see the effects for years.
You'd think that losing key staff would kill the company overnight, but even in this situation, it took 4 years between being doomed and actually dying.
i thought there was an industrywide videogame crash in the early 80s that was blamed on the atari consoles being too open. (everyone was making cartridges, even companies like ralston-purina, quality fell under the flooded market and consumers gave up).
my understanding is that this gave rise to nintendo's tight control over developer licensees while atari was sold off and pivoted to home computers (specifically the st line) under jack tramiel.
The story is more complex than that, Atari split in 2 (the games company and the consumer electronics company) and neither really exist anymore (and haven't for a long time now). But basically yes - they never replicated the success of the VCS and arcades stopped being a major part of the industry in the 90s. All of Atari's biggest hits were in the 70s and (early) 80s.