Cities Skylines 2 dev says it stuck to the original CS2 launch date despite expecting potential backlash, and details a patch to address performance issues.
How’s the frame rate? I saw reports of 7-12 fps from systems that kick the absolute shit out of my two-year-old gaming laptop and had flashbacks of wasting $50 on KSP2, which I still can’t play despite exceeding the minimum specs. So I figured I’d wait to hear from people for a week or so this time instead of potentially wasting money on release day.
There are a lot of entitled people who are upset because they cranked everything to ultra, and yeah, that’s where that 7-12 fps comes from. Most people can’t fathom fiddling with the settings a bit and maybe lowering them.
The dev put out a forum post on which settings cause the biggest lag. I followed their advice and it’s completely playable. I’m about 10 hours in and I’m loving it.
Can you link that forum post? It would have been nice for Paradox to put a link in their useless launcher, or the Steam news, or the launch announcement, or anywhere else. My observation is that volumetrics and global illumination make the game run like garbage, but with global illumination off entirely, the game looks flaaaaaaat.
https://store.steampowered.com/news/app/949230/view/3744239011016263869
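If you want to put numbers on each setting instead of eyeballing it, a frame-capture tool like Intel’s PresentMon logs per-frame times to a CSV. Here’s a rough Python sketch that averages a capture into fps; the MsBetweenPresents column name is an assumption based on older PresentMon output (newer versions rename their columns), so adjust it to match your file:

```python
import csv
import statistics
from pathlib import Path

def average_fps(capture: Path, column: str = "MsBetweenPresents") -> float:
    # Average fps from a PresentMon-style CSV of per-frame times in ms.
    # NOTE: the column name is an assumption; check your capture's header.
    with capture.open(newline="") as f:
        frame_times_ms = [float(row[column]) for row in csv.DictReader(f)]
    return 1000.0 / statistics.mean(frame_times_ms)

# e.g. compare a run with volumetrics/GI on vs. off:
# print(average_fps(Path("gi_on.csv")), average_fps(Path("gi_off.csv")))
```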
Definitely sealion vibes from this comment 🤡
Definitely clown vibes from your whole profile, you little joke
Edit: this guy’s such a freak lol
That’s a big conclusion from so little info, so that says more about your clown ass lmao.
Also, the belittling is quite funny; it implies that you see yourself as the lesser person in the convo 😂
I am a firm believer that if you have a bleeding edge system you are 100% entitled to play stuff at max settings (at least at a reasonable resolution). I don’t see the point in blaming the customers when there is clearly a faulty product here.
Just to clear things up, I am definitely not one of those people with a bleeding edge system; mine’s a 3060.
I feel like some games want to future-proof, so I can understand there being graphics modes that aren’t feasible on current hardware.
So you’re not part of the “Can it run Crysis?” crowd, where the game was essentially designed to run on hardware that didn’t exist yet?
The difference here is that Crysis had graphics never seen before. C:S2 on max settings is nothing groundbreaking; it doesn’t even have ray tracing. In this case it’s performance issues, not futuristic technology.
100%. A top-of-the-line CPU and GPU should not have problems running the game on max settings. It’s so weird seeing everyone defend a game that performs terribly the moment you exercise any of the graphics options.
I don’t have a dog in this fight, but “bleeding edge” literally implies that unreliability is to be expected. That’s why it’s bleeding edge and not leading edge.
English is not my native language, so I may have used the term wrongly; I meant “bleeding edge” as basically very high-end.
Buddy is being pedantic; in casual use, most people use “bleeding edge” in exactly the same way you did.
It’s not being pedantic; I’m not correcting a word choice that doesn’t matter. There’s a pretty big distinction between leading edge and bleeding edge, especially when the stated disappointment is that a piece of software isn’t as stable as expected.
No need to toss insults just to jump to the defense of someone over a pretty simple misunderstanding.
There’s jack-shit difference in the colloquial sense, except that one term people generally know and the other they don’t. If you were telling this to a native English speaker I wouldn’t care, but to an ESL person I feel the need to step in and say, “Yeah no, everyone will understand what you mean with the phrasing you chose; the person correcting you is being hyper-literal.”
1- They didn’t mention being ESL until after the response, so congratulations on the foresight of someone else’s hindsight.
2- have a good night and stay blessed, bud.
No worries; what you meant is “leading edge,” and with that in mind your original statement is probably correct.
“Bleeding edge” in English generally refers to day-zero hardware, software, or services, for which mainstream support most likely doesn’t exist and it’s generally anticipated that issues will be encountered.
I see, thanks for the clarification
In my experience, turning off depth of field and global illumination takes you from 20 fps to 80 (1440p, RTX 3090).
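To put that jump in frame-time terms, here’s the bare arithmetic on the numbers above (nothing game-specific, just a sketch):

```python
def frame_ms(fps: float) -> float:
    # Per-frame time in milliseconds at a given fps.
    return 1000.0 / fps

# Numbers quoted above: ~20 fps with DoF + GI on, ~80 fps with both off.
before, after = frame_ms(20), frame_ms(80)  # 50.0 ms vs. 12.5 ms per frame
print(f"DoF + GI cost roughly {before - after:.1f} ms per frame here")
```

Seen that way, it’s clear why those two toggles swamp every other setting mentioned in this thread.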
A laptop with a 1660 Ti 6GB got me 20-25 fps at 1080p on low to medium, at around 10-20k population. But I turned nearly everything off except level of detail. Turning off vsync somehow made it run around 5 fps faster.
Those are pretty much my exact specs. Asus TUF Gaming?
Mine is a Lenovo Legion I got three years ago.
I’m getting about 30 FPS with a 2090.
The key, though, as far as I can see, is to make sure it’s installed on an SSD; that speeds up level streaming considerably.
Vsync also doesn’t work, but I haven’t had time to figure out if there’s a workaround.
I don’t know who’s reporting 12 frames per second, but they must be running on a potato. It’s definitely not well optimised, but it’s not that bad.
I was getting 7 fps in the main menu before poking at the settings, but my Radeon VII is damaged thanks to a faulty new 1 kW PSU that suicide-bombed my machine. I’m amazed it works at all, tbh.
On ultra at 1080p with no motion blur (edit: and no AA), I’m getting the same as I got in C:S1 at 1440p (25-30 fps, also with the half-dead GPU). I have hope that the additional fixes will bring it on par with C:S1 for fps.