
25 minutes ago, Vyxenne said:

As far as I know, this works in SSE the same way. I just switched mine to the GPU (my dearly beloved GTX970) to see what would happen. If you see a mushroom cloud on the horizon, don't try it. Lol.

Sounds like you need to dust your PC, lol. I do mine every 4 months when I can.

7 hours ago, Vyxenne said:

As far as I know, this works in SSE the same way. I just switched mine to the GPU (my dearly beloved GTX970) to see what would happen. If you see a mushroom cloud on the horizon, don't try it. Lol.

Did you get more FPS, or SMP issues? So which is better, GPU or CPU?

The GPU already has shadows and other stuff to render. I miss the SMP MCM; being able to configure SMP via an MCM would be good. It can't be that hard to write the values to the XML, right?

9 hours ago, chevalierx said:

Did you get more FPS, or SMP issues? So which is better, GPU or CPU?

It probably depends on your components.

 

For example, when I had an FX-8320 CPU and an RX 560 GPU, it was a ton better to render on the RX 560 than on the CPU.

Now that I have a Ryzen 5 2600X, the CPU is 100% better than the GPU in my case

Anyway, if you have top-end parts in both (for example an i9-9900K and an RTX 2080 Ti), I would always set it to CPU, since SMP uses the CPU's cores/threads.

7 minutes ago, panchovix said:

It probably depends on your components.

 

For example, when I had an FX-8320 CPU and an RX 560 GPU, it was a ton better to render on the RX 560 than on the CPU.

Now that I have a Ryzen 5 2600X, the CPU is 100% better than the GPU in my case

Anyway, if you have top-end parts in both (for example an i9-9900K and an RTX 2080 Ti), I would always set it to CPU, since SMP uses the CPU's cores/threads.

I suspect it should always be set to CPU (the default) regardless of your setup, and the following settings are very important too.

 

Set iVSyncPresentInterval=0 under [Display] in SkyrimPrefs.ini, and set Vertical Sync to "Fast" for SSE in your Nvidia Control Panel.
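For clarity, that [Display] tweak looks like this in SkyrimPrefs.ini (a sketch of just the relevant section; leave the rest of your [Display] block untouched):

```ini
[Display]
; 0 disables the game's built-in v-sync, so the driver-level
; Vertical Sync = "Fast" setting in the Nvidia Control Panel takes over
iVSyncPresentInterval=0
```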

16 minutes ago, panchovix said:

It probably depends on your components.

 

For example, when I had an FX-8320 CPU and an RX 560 GPU, it was a ton better to render on the RX 560 than on the CPU.

Now that I have a Ryzen 5 2600X, the CPU is 100% better than the GPU in my case

Anyway, if you have top-end parts in both (for example an i9-9900K and an RTX 2080 Ti), I would always set it to CPU, since SMP uses the CPU's cores/threads.

I have an i5-4460 and a GTX 970.

I think the GPU is more powerful than the CPU?

15 minutes ago, chevalierx said:

I have an i5-4460 and a GTX 970.

I think the GPU is more powerful than the CPU?

Hmm, your CPU has 4 cores/4 threads. I think the GPU might run better, but honestly you would have to try it. Remember:

<enable>true</enable>
true=GPU
false=CPU

First try true; if it runs well, leave it like that. If not, try false (it will use the CPU instead).

EDIT: can someone remind me how many cores/threads were recommended for HDT-SMP? I think it was either 6 cores/6 threads or 6 cores/12 threads.

 

EDIT2: from the Oldrim page, HDT-SMP performance depends on:

 

CPU calculation speed

L3 cache speed and size

GPU shader cache size and speed

GPU bandwidth

Render call refresh rate and framerate  (these are not the same thing)

Frametime 

script latency in calling and executing behaviors (also directly tied to framerate because Bethesda)

I guess the first two apply if HDT-SMP is rendered on the CPU, and the GPU shader/bandwidth ones if it is rendered on the GPU.

10 hours ago, chevalierx said:

I have an i5-4460 and a GTX 970.

I think the GPU is more powerful than the CPU?

Depends on what settings you're running the game at, what mods you have, and what resolution/refresh rate your monitor runs at. I have a 2080 Ti, and my GPU is my bottleneck, so I run HDT on my CPU.

 

Press Ctrl+Alt+Del > go to the Performance tab > leave it open and launch the game > go somewhere your FPS dips > look at the performance monitor to see whether the CPU or GPU is maxed out.

Edit: Does anyone have or recommend any SMP XML files with reduced jiggle that don't make the tits look saggy or like jello? Having a hard time finding a good one. :(

3 hours ago, freetheporn said:

Depends on what settings you're running the game at, what mods you have, and what resolution/refresh rate your monitor runs at. I have a 2080 Ti, and my GPU is my bottleneck, so I run HDT on my CPU.

 

Press Ctrl+Alt+Del > go to the Performance tab > leave it open and launch the game > go somewhere your FPS dips > look at the performance monitor to see whether the CPU or GPU is maxed out.

Edit: Does anyone have or recommend any SMP XML files with reduced jiggle that don't make the tits look saggy or like jello? Having a hard time finding a good one. :(

Use CBBE SMP from the CBBE page on Nexus, or copy values from this:

https://www.loverslab.com/files/file/3285-all-in-one-hdtskinnedmeshphysics-setup-20b-fomod/

It has natural physics.
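For anyone who would rather tone jiggle down by hand: in a typical hdtSkinnedMeshPhysics xml, the damping values on the breast bones are the usual knobs. This is only an illustrative sketch; the bone name and values below are placeholders, and your own xml may use different tags, so check your file before copying anything:

```xml
<!-- illustrative only: higher damping = less residual jiggle -->
<bone name="NPC L Breast">
	<mass>1.0</mass>
	<!-- raise these toward 1.0 to calm movement and reduce the "jello" wobble -->
	<linearDamping>0.4</linearDamping>
	<angularDamping>0.4</angularDamping>
</bone>
```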

On 7/4/2019 at 12:53 PM, Yinkle said:

The reason this happens is that the outfit nif files have a NiStringExtraData node which references an xml file outside the control of (and overriding) your defaultBBPs.xml. This xml file includes breast/butt physics and therefore conflicts with your CBPC setup.

 

Some outfits, however, do not include breast/butt physics in the custom xml file. This is why it works for some and not for others.

 

"Usually" outfits that have just a skirt shape with no reference body in the nif file will work ok but it depends entirely on the mod.

 

What Redswift said a few posts up is correct too.

 

Here is a mod that I helped create which includes an optional download (skirt with virtual body) that allows for SMP/CBPC to work in tandem on the outfit.

I normally don't bother though, as it's a lot of work to separate the SMP part from a "main body" clothing/armor item, and more often than not it would require reworking the esp file to add extra items as well.

 

@Vyxenne I didn't nod sagely but did smile at your mention! :P

That Nier Automata outfit has been deleted by Nexus.  If you want it, download it here -->

Nier Automata Armor - CBBE BodySlide (SMP - Physics).7z

19 minutes ago, CrysisWar1234 said:

That Nier Automata outfit has been deleted by Nexus.  If you want it, download it here -->

Yup, some of his stuff has been deleted on LL too; he is a blatant thief. I'm sorry I ever agreed to work with him.

55 minutes ago, Yinkle said:

Yup, some of his stuff has been deleted on LL too; he is a blatant thief. I'm sorry I ever agreed to work with him.

https://forums.nexusmods.com/index.php?app=forums&module=extras&section=boardrules I think it got taken down due to copyright. I've seen it countless times before (like the BLESS and BDO armors).

 

The posting of any copyrighted material, unless the copyright is owned by you or you have consent from the owner of the copyrighted material, is strictly prohibited on any Nexus site. This includes linking to sites that contain copyrighted material used without permission of the copyright holder. Legal inquiries regarding copyright infringement are taken very seriously on Nexus sites and we will work with any legal body to identify and bring to justice anyone who might use a Nexus site to share copyrighted material.


Hi guys,

 

I decided to test the gpu/cpu setting in configs.xml and it just throws an "unknown config" error in the log.
 

This was the configs.xml that I used:

 

<?xml version="1.0" encoding="utf-8"?>

<configs>
	<solver>
		<numIterations>16</numIterations>
		<groupIterations>16</groupIterations>
		<groupEnableMLCP>false</groupEnableMLCP>
		<erp>0.2</erp>
		<min-fps>60</min-fps>
	</solver>
	<opencl>
		<enable>true</enable>
	</opencl>
</configs>

 

I'm guessing that it's deprecated in SMP for SE.

22 hours ago, Yinkle said:

Hi guys,

 

I decided to test the gpu/cpu setting in configs.xml and it just throws an "unknown config" error in the log.
 

This was the configs.xml that I used:

 


<?xml version="1.0" encoding="utf-8"?>

<configs>
	<solver>
		<numIterations>16</numIterations>
		<groupIterations>16</groupIterations>
		<groupEnableMLCP>false</groupEnableMLCP>
		<erp>0.2</erp>
		<min-fps>60</min-fps>
	</solver>
	<opencl>
		<enable>true</enable>
	</opencl>
</configs>

 

I'm guessing that it's deprecated in SMP for SE.

Maybe try this:


<opencl>
	<!-- true: GPU - false: CPU -->
	<enable>true</enable>
	<platformID>0</platformID>
	<numQueue>16</numQueue>
</opencl>

I don't know how important those additional settings are, though they are included with the LE SMP, too.

14 hours ago, Mister X said:


<opencl>
	<!-- true: GPU - false: CPU -->
	<enable>true</enable>
	<platformID>0</platformID>
	<numQueue>16</numQueue>
</opencl>

I don't know how important those additional settings are, though they are included with the LE SMP, too.

They may be included in the configs.xml, but none of those switches are in the DLL. I looked for them in the past when I was disassembling the LE version of the DLL to search for possible constraint switches (if the SE DLL throws the same error, then they are not implemented).

I believe the author used these config switches for an experimental build which he never released.

 

I performed several tests with high-poly triangle-triangle mesh collisions in the past, and they always ended by killing the CPU. I tried different system configurations using Nvidia cards, AMD cards, and integrated Intel HD Graphics, but it always ended with the same result when a collision occurred: CPU usage at 100%, GPU usage barely moving.

I thought that something might be wrong with my system, so I tested it with hashcat: all OpenCL devices were working as expected, and I was able to observe GPU utilization when I chose the GPU as the OpenCL device.

 

Without source code I can only speculate, but I believe the current versions of the DLLs force the CPU (CL_DEVICE_TYPE_CPU) to be used regardless of any GPU OpenCL devices available in the system.

On 7/11/2019 at 11:40 AM, Yinkle said:

Hi guys,

 

I decided to test the gpu/cpu setting in configs.xml and it just throws an "unknown config" error in the log.
 

This was the configs.xml that I used:

 


<?xml version="1.0" encoding="utf-8"?>

<configs>
	<solver>
		<numIterations>16</numIterations>
		<groupIterations>16</groupIterations>
		<groupEnableMLCP>false</groupEnableMLCP>
		<erp>0.2</erp>
		<min-fps>60</min-fps>
	</solver>
	<opencl>
		<enable>true</enable>
	</opencl>
</configs>

 

I'm guessing that it's deprecated in SMP for SE.

I used this xml and did not see any errors, although I did not see any difference in my fps either...

<?xml version="1.0" encoding="utf-8"?>

<configs>
<!-- alleged to control where the calcs are done- false=CPU, true=GPU -->
	<enable>true</enable>
	<solver>
		<numIterations>16</numIterations>
		<groupIterations>16</groupIterations>
		<groupEnableMLCP>true</groupEnableMLCP>
		<erp>0.2</erp>
		<min-fps>60</min-fps>
	</solver>
</configs>

Looking at your xml, it would seem that mine needs some tags around the <enable></enable> declaration... however, other than changing false to true, I have not edited this configs.xml at all. So now I'm wondering where I got it and whether I need to replace it...
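For what it's worth, if the layout Mister X posted is the correct schema, a fixed-up version of this file would presumably just move the flag inside an <opencl> element, something like this (same values as my file; the wrapper placement is only a guess at the intended structure):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configs>
	<solver>
		<numIterations>16</numIterations>
		<groupIterations>16</groupIterations>
		<groupEnableMLCP>true</groupEnableMLCP>
		<erp>0.2</erp>
		<min-fps>60</min-fps>
	</solver>
	<opencl>
		<!-- alleged to control where the calcs are done: false = CPU, true = GPU -->
		<enable>true</enable>
	</opencl>
</configs>
```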

 

But then, according to OrrieL, it doesn't matter what's in it, because nothing there works anyway... or at least the CPU/GPU OpenCL switch doesn't. It's unclear from the above series of posts what parts of configs.xml (if any) are actually required and useful. :classic_blink: However, as a former ERPer in a couple of games, I was hoping the

<erp>0.2</erp>

tag would be fun, and wondering what setting I could use to maximize the effects... :classic_biggrin:

