
i7 7700k 1080 ti = LOL ?


LorDKyuBy

Recommended Posts

Hi guys, I have a custom water-cooled rig with a 1080 Ti Strix and an i7 7700K, and I get 120 FPS, but most of the time it drops to 30-40-60... I mean, what? Is my PC trash, or is it the game's optimization? I don't get how a £2,800 PC can't run this. What can I do? Sorry for my English.

 


Blame the scrub developers. Also blame their choice of engine. Unreal Engine is trash when used in most open-world/large-map games: it either performs badly across the board or won't run properly all the time even on overkill hardware. However, there are things you can do to stop the random frame drops, though it sort of defeats the purpose of a high-end PC, lol. I have a mid-range PC and I don't experience frame drops below 60 FPS, while max FPS is easily 110-120 on max graphics settings. An R5 1600X and an RX 480 8GB with 16 GB of RAM have run the game fine for as long as I can remember.

If you haven't already, go into your graphics driver settings and look for the anti-aliasing mode option. Set it either to "use application settings" or to "enhance". Either option hands anti-aliasing control over to the application (the game) instead of letting the graphics card do it. When the graphics card forces its own AA, it clashes with the game's engine, which is doing the same thing for whatever stupid reason, and you get FPS drops at times when you shouldn't.

Another thing to try, if you haven't already, is the 64-bit client. The 32-bit client only allows a certain amount of RAM to be used by the game (can't remember how much). The 64-bit client lets the game bypass that limit and use more RAM, increasing stability (especially FPS-wise during fights). This improvement is most noticeable on systems with plenty of RAM rather than on systems with 8 GB or less.
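For context on that RAM limit: a 32-bit Windows process is normally capped at 2 GB of address space, or 4 GB if the executable is flagged Large Address Aware. As a rough, game-agnostic illustration (nothing BnS-specific), this sketch reads that LAA flag from any .exe's PE header:

```python
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020  # PE Characteristics bit

def is_large_address_aware(data: bytes) -> bool:
    """Return True if a PE image (e.g. the bytes of an .exe) is LAA-flagged."""
    if data[:2] != b"MZ":
        raise ValueError("not a PE file")
    # e_lfanew at offset 0x3C points to the "PE\0\0" signature
    (pe_offset,) = struct.unpack_from("<I", data, 0x3C)
    if data[pe_offset:pe_offset + 4] != b"PE\x00\x00":
        raise ValueError("missing PE signature")
    # Characteristics field sits 0x16 bytes past the PE signature
    (characteristics,) = struct.unpack_from("<H", data, pe_offset + 0x16)
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)
```

A 64-bit build sidesteps the question entirely, since its address space is far larger than any amount of RAM the game could use.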

Another thing to do is raise the background FPS cap to match the in-game cap (120). That way, when you're tabbing back into the game you don't have to wait for the FPS to shoot back up and stabilise before you can start playing. It only lasts a few seconds, but it gets annoying very quickly if you alt-tab often.

There is also another thing that can help, which forces the game to use all cores, but that's all I can say on that, for reasons. Good luck.


So is it at maximum capacity at 120? I actually get 100-120 FPS on full details (level 5). The problem is that it drops to 30, and that is ridiculous, lol. In the graphics card settings I already set maximum performance and turned V-sync off... I really love this game, lol.

 


8 hours ago, LorDKyuBy said:

Hi guys, I have a custom water-cooled rig with a 1080 Ti Strix and an i7 7700K, and I get 120 FPS, but most of the time it drops to 30-40-60... I mean, what? Is my PC trash, or is it the game's optimization? I don't get how a £2,800 PC can't run this. What can I do? Sorry for my English.

 

The game is based on Unreal Engine 3, which doesn't support GPGPU features such as Nvidia PhysX. In addition, UE3 is not made for games with a large amount of physics to calculate, which is the case in most MMORPGs.

That's due to the massive player base, who may all meet at once in one place in the game. Character models and effects cause a huge physics load, which then has to be calculated by the CPU.

Since UE3 doesn't support these GPGPU methods, you can't set the game to calculate physics on the GPU instead of the CPU either. Doing so would bring a performance gain, since GPUs have far more cores than CPUs and can execute far more operations per second. A CPU is very limited here, since most consumer CPUs are built with 2-4 cores, while GPUs (depending heavily on the model) usually have well over a thousand; your 1080 Ti has 3584 CUDA cores. You can imagine how big the difference in numbers is.
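To put rough numbers on that difference (a back-of-the-envelope sketch with assumed boost clocks, not measured figures), theoretical peak FP32 throughput scales with cores × clock × operations per cycle:

```python
def peak_tflops(cores: int, clock_ghz: float, flops_per_cycle: int = 2) -> float:
    """Theoretical peak single-precision TFLOPS (FMA counts as 2 FLOPs/cycle)."""
    return cores * clock_ghz * flops_per_cycle / 1000.0

# GTX 1080 Ti: 3584 CUDA cores at roughly 1.58 GHz boost -> ~11.3 TFLOPS
gpu = peak_tflops(3584, 1.58)
# i7 7700K: 4 cores, each with 8-wide AVX2 FP32 lanes, at ~4.2 GHz -> well under 1 TFLOPS
cpu = peak_tflops(4 * 8, 4.2)
```

The ratio is on the order of 40x, which is why physics that can be expressed as data-parallel work benefits so much from GPGPU when the engine supports it.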

 

So it's not your PC that lacks performance; it's the game that doesn't run smoothly due to its engine's limits.

This is also why it doesn't make sense to build a PC with hardware far too overpowered for most games: it's a waste of money on power you basically can't unleash.

 

Some people may then ask: why did NCSOFT choose Unreal Engine 3 for BnS?
I can't answer that for certain, but I think they did because Unreal Engine 3 is cheap. Creating their own engine, or licensing a better engine already on the market, would have been much more expensive.

6 hours ago, Katu Kat said:

Another thing to do is raise the background FPS cap to match the in-game cap (120). That way, when you're tabbing back into the game you don't have to wait for the FPS to shoot back up and stabilise before you can start playing. It only lasts a few seconds, but it gets annoying very quickly if you alt-tab often.

If you increase background FPS, the game causes higher load while idle. I wouldn't recommend this option, and it won't change anything in terms of performance gain either. Why do you even mention it?

1 hour ago, MissHaiko said:

Their 64-bit client has been buggy since the last big patch. Use the 32-bit one.

Frames per second have nothing to do with network latency.

Besides, ping is a command and tool based on the ICMP protocol. Ping has nothing to do with network latency either.


28 minutes ago, RAID said:

If you increase background FPS, the game causes higher load while idle. I wouldn't recommend this option, and it won't change anything in terms of performance gain either. Why do you even mention it?

Read over what I said; here, let me highlight it:

28 minutes ago, RAID said:

That way, when you're tabbing back into the game you don't have to wait for the FPS to shoot back up and stabilise before you can start playing. It only lasts a few seconds, but it gets annoying very quickly if you alt-tab often.

And yes, it does give a higher load, which in OP's case is negligible at the very worst. It's a 1080 Ti running an MMO that came out in 2012, lol, give me a break. I'd understand if it were a mid-range card, and even then it's not a problem. It's also fine to leave it the way you say, but I brought it up anyway because it still helps, especially if you tab out of the game from time to time. When you tab back in, you won't be able to run/sprint/do whatever with ease, because the minimum FPS is considerably lower than the maximum. I'm sure you know what it looks like when background FPS is set to 5 and you tab back into the game: it lags like hell for a few seconds, then resumes its usual frame rate.

As for them choosing Unreal Engine: it's cheaper, considerably cheaper, than a proprietary engine. I think the only time they pay Epic is when they ship the finished game, as a royalty of less than 10% of whatever the game rakes in over a stipulated time frame. Also, if you read over everything I said, you'll notice I was blaming the engine from the get-go, and the devs too of course, but more so the engine. ARK: Survival Evolved runs like trash at 1080p on a Titan Xp at maximum settings; it struggles to hit 60 and drops to 40 in bushy areas. PUBG is another sad mess. The only exceptions to the pattern I've mentioned are games like Fortnite and the Borderlands series; those run "okay", not optimal but good enough, either because the game isn't that intensive graphics-wise and/or the devs knew how to do a better job with a crap engine.

With all that said, the options I mentioned still help increase the game's performance a bit, or at the very least give a better experience in scenarios people often find themselves in, mine being frequent tabbing, for example.


6 hours ago, RAID said:

The game is based on Unreal Engine 3, which doesn't support GPGPU features such as Nvidia PhysX. In addition, UE3 is not made for games with a large amount of physics to calculate, which is the case in most MMORPGs.

That's due to the massive player base, who may all meet at once in one place in the game. Character models and effects cause a huge physics load, which then has to be calculated by the CPU.

Since UE3 doesn't support these GPGPU methods, you can't set the game to calculate physics on the GPU instead of the CPU either. Doing so would bring a performance gain, since GPUs have far more cores than CPUs and can execute far more operations per second. A CPU is very limited here, since most consumer CPUs are built with 2-4 cores, while GPUs (depending heavily on the model) usually have well over a thousand; your 1080 Ti has 3584 CUDA cores. You can imagine how big the difference in numbers is.

 

So it's not your PC that lacks performance; it's the game that doesn't run smoothly due to its engine's limits.

This is also why it doesn't make sense to build a PC with hardware far too overpowered for most games: it's a waste of money on power you basically can't unleash.

 

Some people may then ask: why did NCSOFT choose Unreal Engine 3 for BnS?
I can't answer that for certain, but I think they did because Unreal Engine 3 is cheap. Creating their own engine, or licensing a better engine already on the market, would have been much more expensive.

If you increase background FPS, the game causes higher load while idle. I wouldn't recommend this option, and it won't change anything in terms of performance gain either. Why do you even mention it?

Frames per second have nothing to do with network latency.

Besides, ping is a command and tool based on the ICMP protocol. Ping has nothing to do with network latency either.

 

I can tell you why they used UE3: it's an old game. Its first version went live in 2012, which means they had been working on it since at least 2011, and UE3 was the "best" technology at that time. Then the game grew and became too heavy for that tech.

 

BTW, ping is a reflection of network latency, you can't deny that. If the network has any problem, ping spikes or loses packets, and it's a tool used to monitor networks all around the world.
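To sketch the point being argued here: ping just reports the round-trip time of a probe packet. Sending real ICMP needs raw sockets, so this hypothetical helper times a TCP handshake instead, which measures the same round-trip latency:

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 80, timeout: float = 2.0) -> float:
    """Time a TCP handshake to the host. The result is one network round
    trip, the same quantity ping's 'time=' field reports for ICMP."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; close immediately
    return (time.perf_counter() - start) * 1000.0
```

Whatever the probe protocol, a spike in this number is exactly what players experience as lag, which is separate from FPS drops caused by the CPU or GPU.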


9 hours ago, caioc2 said:

 

I can tell you why they used UE3: it's an old game. Its first version went live in 2012, which means they had been working on it since at least 2011, and UE3 was the "best" technology at that time. Then the game grew and became too heavy for that tech.

Far from the truth. They used UE because it's cheap: it was an already-existing engine that just needed some playing around with to achieve the famous art style Blade & Soul is known for. Proprietary engines take a while to develop from scratch and cost more to tweak and maintain, meaning you always need staff on hand who are familiar with them, and having staff with deep practical knowledge of an in-house engine leave or rotate out isn't good. That matters less with a widely licensed engine, since more people are familiar with it; it takes fewer specialised devs, which means pay is lower. Like I said from the get-go: cheaper.

Epic's own admission: (image attachment: pH7LJAa.jpg)

All of that takes much longer and costs more to achieve in a proprietary engine. Just to prove a point: there are proprietary engines that were used to make games that look just as good as Blade & Soul, and I'm talking about games that came out before 2008, lol. Even then, there were still licensable engines that could have made the game run better, though maybe not look as good: Unity and CryEngine, for example, the latter able to utilise multiple cores/threads by default, by the way.


2 hours ago, Katu Kat said:

Far from the truth. They used UE because it's cheap: it was an already-existing engine that just needed some playing around with to achieve the famous art style Blade & Soul is known for. Proprietary engines take a while to develop from scratch and cost more to tweak and maintain, meaning you always need staff on hand who are familiar with them, and having staff with deep practical knowledge of an in-house engine leave or rotate out isn't good. That matters less with a widely licensed engine, since more people are familiar with it; it takes fewer specialised devs, which means pay is lower. Like I said from the get-go: cheaper.

Epic's own admission: (image attachment: pH7LJAa.jpg)

All of that takes much longer and costs more to achieve in a proprietary engine. Just to prove a point: there are proprietary engines that were used to make games that look just as good as Blade & Soul, and I'm talking about games that came out before 2008, lol. Even then, there were still licensable engines that could have made the game run better, though maybe not look as good: Unity and CryEngine, for example, the latter able to utilise multiple cores/threads by default, by the way.

I do not deny that it's cheap, but that doesn't make my statement false. It was indeed the best technology at the time. Various AAA games were released on UE3, not only because it was "cheap" but because it has a good SDK and tech. Nobody today releases a game on a proprietary engine unless they're going to use it for multiple games or sell it to other devs.

 

I haven't seen an MMO from 2008 with graphics as good as BnS; if you're not talking about MMOs, then we have a long list of games also using UE3.

 

About "multiple cores/threads": this is an old issue that persists even today for most games; up to DX11, multiple cores couldn't be utilised as well as with DX12. MMOs suffer a lot more from it because they have many players/objects around at the same time (CPU-intensive), more than most offline games, and they don't have the money to craft a lag-free game. One thing I can assure you: they could have done better with UE3, just as they could have done worse with Unity or CryEngine, which also comes back to your point: money.


  • 4 weeks later...
On 18/12/2017 at 3:53 PM, caioc2 said:

I do not deny that it's cheap, but that doesn't make my statement false. It was indeed the best technology at the time. Various AAA games were released on UE3, not only because it was "cheap" but because it has a good SDK and tech. Nobody today releases a game on a proprietary engine unless they're going to use it for multiple games or sell it to other devs.

I don't deny that UE3 was a good engine to start with; however, it wasn't the optimal engine for games where a lot of physics comes together, which is usually the case in MMORPGs due to the masses of players you meet there.

 

On 18/12/2017 at 3:53 PM, caioc2 said:

About "multiple cores/threads": this is an old issue that persists even today for most games; up to DX11, multiple cores couldn't be utilised as well as with DX12. MMOs suffer a lot more from it because they have many players/objects around at the same time (CPU-intensive), more than most offline games, and they don't have the money to craft a lag-free game. One thing I can assure you: they could have done better with UE3, just as they could have done worse with Unity or CryEngine, which also comes back to your point: money.

Multi-core usage isn't limited by the game engine itself. It's a feature you add to your game with additional effort and rework.

In most cases it's not considered useful or effective for games to scale across many cores, because most of a game's performance comes from the graphics card, which is why you'll see several games that don't benefit from multi-core CPUs.

 

Just a side note: these days 4-core CPUs are quite common. However, you don't buy a CPU with more cores than that for gaming alone anyway.


On 17.12.2017 at 11:03 AM, LorDKyuBy said:

Hi guys, I have a custom water-cooled rig with a 1080 Ti Strix and an i7 7700K, and I get 120 FPS, but most of the time it drops to 30-40-60... I mean, what? Is my PC trash, or is it the game's optimization? I don't get how a £2,800 PC can't run this. What can I do? Sorry for my English.

 

Well, I have a similar setup to yours: an i7 7700K, but overclocked and delidded, with custom water cooling and a GTX 1070 that is also OC'd, except that I play at 144 Hz with 3D Vision and I also stream.

 

Maybe you want to turn down Monster Attack Effects to 3 instead of 5, and you might want to use the "use all cores" command included in BnSBuddy. I don't know NCWest's stance on that, but as long as you don't touch anything other than "use all cores" and "no texture streaming", you should be good, I think.

To any mod: if my point about BnS Buddy isn't allowed, then please remove that line.
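I can't speak to how the BnSBuddy tweak works internally, but the general idea behind any "use all cores" option is just fanning independent work out across every logical CPU instead of pinning it to one or two threads. A toy, game-agnostic sketch of that principle:

```python
import multiprocessing as mp

def simulate_physics(n: int) -> int:
    """Stand-in for a CPU-bound engine task (hypothetical workload)."""
    return sum(i * i for i in range(n))

def run_on_all_cores(workloads):
    """Spread independent tasks over every available core, which is what a
    'use all cores' toggle aims for in an engine's job system."""
    with mp.Pool(processes=mp.cpu_count()) as pool:
        return pool.map(simulate_physics, workloads)
```

An engine that only runs its simulation on one or two threads leaves the remaining cores idle, which is exactly the CPU bottleneck being discussed in this thread.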

 

Also, BnS doesn't run in true fullscreen by default; it runs as a borderless window. You can use the Windows shortcut Alt+Enter to force the game into true fullscreen, which gives your GTX more grip on the game.

 

But yes, it's true that BnS is CPU-heavy. I don't know if your i7 is at stock clocks, but you can safely go for 4.8 GHz while staying below 1.4 V core voltage, which might help stabilise your FPS. Always remember, though: the heat problem on your i7 isn't the water cooling, it's inside the CPU itself, right between the die and the IHS, what we call Intel toothpaste (poor thermal paste); AMD CPUs are soldered from die to IHS.

 

If you have any questions regarding the OC, just ask me. ^^


On 18.12.2017 at 2:31 AM, caioc2 said:

 

I can tell you why they used UE3: it's an old game. Its first version went live in 2012, which means they had been working on it since at least 2011, and UE3 was the "best" technology at that time. Then the game grew and became too heavy for that tech.

No, it wasn't; CryEngine 3 was already out, and it was the best at that time.
https://en.wikipedia.org/wiki/CryEngine

 

Actually, NCSoft already had experience with the first CryEngine: Aion: Tower of Eternity.

 

But I agree with your comment that UE3 was cheaper to license, since we all know a CryEngine 3 license was expensive as hell.


On 17-12-2017 at 11:03 AM, LorDKyuBy said:

Hi guys, I have a custom water-cooled rig with a 1080 Ti Strix and an i7 7700K, and I get 120 FPS, but most of the time it drops to 30-40-60... I mean, what? Is my PC trash, or is it the game's optimization? I don't get how a £2,800 PC can't run this. What can I do? Sorry for my English.

 

Your mobo sucks. I had the same issue with a slightly cheaper PC: my FPS dropped from a fixed 91 to 20-30. I changed the motherboard and never had FPS drops again, and my PC is only €1,100 (also, I have AMD).

 


12 hours ago, Kodiak said:

No, it wasn't; CryEngine 3 was already out, and it was the best at that time.
https://en.wikipedia.org/wiki/CryEngine

Well, CryEngine isn't that much better than UE3, because it still relies on heavily modded resources, which causes FPS drops when a player is nearby, even worse when they're from the opposite faction. Now even the beefiest computers struggle with sieges, due to the same CPU bottleneck BnS has. Sadly, it's outdated technology, poorly optimized by today's standards. Let's hope some of the developers behind DirectX or the Vulkan API think of a workaround to force usage of all cores.


Archived

This topic is now archived and is closed to further replies.
