Is it time for a new way of making AAA games?

I play on PC so things are fine for me performance wise, but saw that the console releases have some issues.

So I was wondering: maybe CDPR should have focused on releasing Cyberpunk for PC first, perhaps together with the next-gen version as well, and spent more time polishing those versions and getting rid of bugs. Then they could have announced a console release later on, giving themselves more time to optimize for those platforms in particular, so console players would get a good (or at least better) experience as well.

I know it's about sales numbers, which are impressive, but I still have a feeling it might have been the better choice, because that way they probably wouldn't have gotten as many bug reports and as much bad feedback.

Because as with most AAA games released today, it seems to be the norm that they ship with a lot of bugs, and often they're bugs these companies must have known about beforehand: floating weapons, T-posing enemies and so on. We saw it in Fallout 76, Avengers and others as well. And my impression of gamers in general is that we are starting to get really fed up with it.

So maybe it's time for these companies to start reinventing how they release games, so they get better at releasing them in a good state.

Let's imagine that CDPR had said that Cyberpunk would be released for PC on the 10th of December and for consoles about 6 months later.

Would that have been better or worse?
 
It would be nice if games could be made like movies.

There is less time pressure because your movie can't become "outdated" - the only pressure is investors wanting their ROI. Cinema tech is mature and practically set in stone - we are still using 24 fps, with film-level image quality being the standard. You can delay a film a year or two and most of the time it won't make a difference - barring time-sensitive content.

With games, on the other hand, you are forced to chase the technological cutting edge - with developers trying to predict where that edge will be at the projected launch date. If you miss the launch ... your game looks dated.

That said, we might be getting there. Semiconductor wafer fabrication, the tech that has driven the progress of IT, has kind of hit a brick wall. Getting to the next nodes is becoming prohibitively difficult and expensive. That means fewer fabs can do them, and those that can will charge a king's ransom - because of the setup cost and the lack of competition.

TSMC is the only one reliably delivering cutting-edge processes now. Samsung's latest process isn't doing that well - Nvidia's latest chips consume an insane amount of power. Intel is still bumbling along somehow - I wish them the best of luck. GlobalFoundries has exited the cutting-edge business entirely. So yeah ... we've only got three fabs still chasing the cutting edge, and two of them might be wavering.

TL;DR - Given the state of technology today, games might end up being made like movies - a more sane process.
 
It would be nice if games could be made like movies.
Agree.

Looking at Cyberpunk at its fullest potential, with RTX and all - even just medium settings without RTX look fairly decent. But it is a demanding game, and a complex one.

And I can't help thinking that CDPR was standing at a crossroads between the old consoles and the release of the new generation. The old ones are a huge market, since most people haven't switched to the new generation yet, so they didn't want to miss out on that. But on the other hand, their game really seems aimed at the newer generation rather than the old one. So what do you do?

Personally, if I had been CDPR, I would probably have excluded the oldest consoles and focused on PC and next-gen consoles, to make sure all players could experience the game at good quality. As it is, they now have to spend a lot of time and energy optimizing the game for consoles that are going to be replaced sooner or later anyway.

Maybe as a general rule, if companies can't deliver a solid 24-30 fps on the consoles, they should strongly consider excluding them. It's not like PCs, where you can always upgrade a bit here and there and eventually hit a decent fps.
 
I would like to see the video game industry suffer another meltdown like in the late 80s culminating in a bunch of call of duty servers being buried in the desert. Only a cleansing through fire can save the state of 'AAA' games in my opinion.
Boy, I bet you're fun at parties o_O
 
if they delayed the console versions it would hurt sales and piss off shareholders

and in all seriousness, console players are kinda foolish for wanting to play a game like this on 7-year-old hardware lol

another meltdown like in the late 80s
that myth is way over-exaggerated by youtubers and gaming journalists who just want a few extra clicks from easily made-up headlines. I mean, sure, there were a lot of bad games made in the early 80s, but it's not like video games were absolutely hated at the time - plenty of arcades still existed, and my family in particular had just purchased an Atari in the early 80s and didn't even know video games had existed before then

so yeah, that story is a little bit fictitious, or at least exaggerated - based on a single landfill full of old Atari games and the AVGN making a big deal out of it lol
 
if they delayed the console versions it would hurt sales and piss off shareholders

and in all seriousness, console players are kinda foolish for wanting to play a game like this on 7-year-old hardware lol
I really don't think so.

If a company says their game will run on it, what can people do? They couldn't test it beforehand, and no videos or review copies were released, so they had to trust that they were being given the right information.
 
So was wondering, maybe CDPR should have focused on releasing Cyberpunk for PC first

Yes, it would have made things considerably easier. But of course, the flip side is that there are way more people with consoles than with gaming PC rigs. However, Covid has changed things - gaming rigs are on the rise again, and with AMD and Nvidia duking it out in the ring again, there's never been a better time to build one.

 
If a company says that their game will run on it, what can people do?
if a company says shut down your business and stay in your homes, what can you do

lol, there's plenty you can do... it's called learning to think for yourself - companies don't rule you

advertisers do lie, bro, it's 2020
 
Massive RPGs should always be developed for PC first, given their gargantuan development process, and only then ported to the current generation of console hardware - not the other way around. It is painfully obvious that a lot of this game's problems stem from it having to be developed and launched simultaneously for nine different hardware variations, the oldest of them being last-gen base consoles that were already 4 years behind mid-range PCs back when they launched years ago.

Back in 2013, when the first teaser for this game launched, things like relatively affordable ray tracing on consumer hardware were theoretical at best. The longer development takes, the harder it becomes to scale the game for all the hardware variations that come and go along the way. That is why games of this scale should always be developed for PC first, and only after the PC version is complete should porting begin for whatever console hardware is current at that time. PC hardware is scalable and modular, allowing the development team to raise their target specs during development and keep up with technological progress - much easier than developing something for 8 years and then trying to make it run smoothly both on current hardware and on something that is effectively a decade old at that point.

It is very possible that by the time the next-gen upgrade for this game is near launch, the console manufacturers will have already announced or even launched their respective pro models, making CDPR leave horsepower on the table again, because their development process simply can't keep up with non-scalable console hardware on a game of this scope.
 
Not to excuse anyone - the old-console launch is a disaster, and to add insult to injury, they made sure to cover it all up - but it's not like you can possibly catch every bug in-house, especially in a game as complex as Cyberpunk 2077. I don't really see how that premise, that bugs will be there no matter what, can be changed anyway, short of some early access scheme or a wide (closed) beta test. Burning the industry down won't change it either. I guess some neural network program, a.k.a. AI, has the potential to bring some changes here, though.
 
that myth is wayyy over-exaggerated by youtubers and gaming journalists who just want to get a few extra clicks

It was a massive market crash that is well documented by journalists, economists and historians. It had effects on the economy, and multi-million-dollar companies went bust. You owning an Atari doesn't negate that. Anyway, that isn't really relevant - I was using the example for poetic effect.
 
It sucks for people who play on the last gen of consoles. But even looking at the minimum PC specs, an SSD is recommended (in parentheses). A lot of the problems on console have to do with pop-in and loading, and I think the lack of an SSD could be one of the reasons. I'm just hoping they can find some way to optimize the game better for console.
 
The Series X version seems largely fine (so far), but I understand the story on base consoles is different.

I'm not convinced 'burn it all down' is the answer.

If a game isn't great it won't sell as many copies. Oh, hang on... :rolleyes:
 
But even when looking at the min specs for PC it has ssd recommended in parentheses.

This is an excellent point. Back when the game was originally announced to the public, SSDs had only just started becoming mainstream in PCs, and M.2 drives had just been introduced - extremely rare and expensive back then. 8 years later, I now have the game installed on a 100€ M.2 drive, and the latest console generation has only just brought M.2 into the console space.
 
Shouldn't have bothered with trying to make it run on last gen consoles.

Their biggest crime was promising that it would, and then holding on to that promise. If they had just dropped the old-gen consoles completely, things would've been far easier for them - and by extension, for us.

Anyway, I expect the game to get into much better shape over 2021, at least as far as the next-gen consoles and PC are concerned. The old gen can't perform magic.
 