Posts: 277
Joined
Last visited
Days Won: 4
Kieran last won the day on September 11 2012
Kieran had the most liked content!

About Kieran
- Birthday: 12/07/1996

Profile Information
- Gender: Male
- Location: Sydney (YSSY), Australia

Recent Profile Visitors: 6,958 profile views
Kieran's Achievements: Newbie (1/14)
Reputation: 96
-
Thanks deanbrh, I'll take that on board.
-
What is the best computer setup for x-plane 10
Kieran replied to skyhawk386's topic in General Discussion
True, it is not completely single-threaded, but it is definitely limited far more by single-threaded performance than by the number of threads you have, because, as sqrt(-1) said, the rendering task is single-threaded and is the most time-consuming work the CPU does per frame (rough sketch of the effect below). - 9 replies
- Tagged with: new computer, system setup (and 1 more)
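Just to put a rough number on that point, here's a quick Amdahl's-law style sketch in Python. The 80% single-threaded share of the frame is purely an illustrative assumption, not a measured X-Plane figure; it just shows why extra cores buy very little here while a faster single thread helps the whole frame.

```python
# Toy Amdahl's-law sketch: if most of the per-frame CPU work (the rendering
# pass) stays on one thread, extra cores add very little, while a faster
# single thread speeds up the whole frame. The 0.8 serial share is purely
# illustrative, not a measured X-Plane number.

def speedup(serial_fraction: float, cores: int) -> float:
    parallel = 1.0 - serial_fraction
    return 1.0 / (serial_fraction + parallel / cores)

serial = 0.8  # assumed share of frame time spent single-threaded
for cores in (2, 4, 8):
    print(f"{cores} cores: {speedup(serial, cores):.2f}x")
# -> roughly 1.11x, 1.18x, 1.21x, whereas a 20% faster single thread is ~1.2x on its own.
```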
-
Yeah indeed, it should be interesting to see how much of a performance gain it delivers.
-
Nano, thanks for the advice, I've got my fps much smoother now, up around 60 fps. Yeah, the 5960X isn't the most suitable CPU for X-Plane, but have you seen the newly released Intel CPUs?
-
Thanks for the advice. I've turned the water reflection down to default, but it doesn't change my fps unless I'm flying near water. My NVIDIA settings are at their defaults, with the exception of "Power management mode" being set to "Prefer maximum performance". I don't have any power saving modes enabled, and with a 1000W PSU I'm in the clear there for sure. Cooling is also easily sufficient: CPU temps are around 60 degrees, and the GPU gets to 70 degrees (though my GPU has a silent fan mode where the fans don't spin unless the temperature is above 65 degrees, I think it is).
-
I went back and changed a few settings around, so now I'm getting better fps, and I'm testing with the Cirrus Jet. I'm still seeing CPU utilization on one core of 60-70% and GPU utilization of 50-60%, but perhaps that's normal for X-Plane? Nano, benchmarks are usually run at stock speeds, so the 5960X running at its stock speed of 3.0 GHz won't perform as well as the 4.0 GHz 4790K; at the same clock speed, however, they'd perform very similarly, seeing as they're built on the same Haswell architecture, so my overclock brings it in line with a 4790K at stock speeds.
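For what it's worth, here's the back-of-the-envelope version of that clock comparison in Python. It assumes single-threaded performance scales roughly linearly with clock speed for the same Haswell-class core and ignores turbo, cache and memory differences, so treat it as a rough sketch only.

```python
# Back-of-the-envelope single-thread comparison, assuming performance scales
# roughly linearly with clock for the same Haswell-class core and ignoring
# turbo, cache and memory differences.

ghz_4790k_stock = 4.0  # i7-4790K stock, as used in most benchmarks
ghz_5960x_stock = 3.0  # i7-5960X stock
ghz_5960x_oc = 4.3     # my overclock

print(f"5960X stock vs 4790K stock: {ghz_5960x_stock / ghz_4790k_stock:.0%}")   # ~75%
print(f"5960X @ 4.3 GHz vs 4790K stock: {ghz_5960x_oc / ghz_4790k_stock:.1%}")  # ~107.5%
```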
-
Yeah, except at the moment I'm getting around 20 fps. I know I can turn down some settings and get more fps, but I'm just wondering if there is something slowing things down.
-
Hi all,

So I've recently upgraded to a new system (for reasons other than X-Plane), and having just gotten some spare time back I've been getting back into things, but I'm a bit disappointed by the performance I'm seeing. My specs are:
- i7-5960X 8-core CPU (with hyperthreading disabled) @ 4.3 GHz (overclocked)
- 32 GB DDR4 2400 MHz RAM
- ASUS X99 Deluxe
- GTX 980 Ti 6 GB (overclocked to ~1450 MHz)
- Samsung 850 Pro 512 GB SSD
- Windows 10 (I haven't had any problems with any other games or benchmarks underperforming in Windows 10)
- 1080p monitor

I don't have a screenshot of my settings at the moment, but what I'm wondering about is that neither my CPU nor my GPU usage is near 100%. I know X-Plane is very single-threaded, but the highest usage on any one core is hovering around 60-70%, while my GPU usage is at 30-40%. I also know that I'm nowhere near running out of 32 GB of RAM or 6 GB of VRAM. Looking at the fps overlay in X-Plane, I'm seeing times of 0.061 for the frame, 0.056 for the CPU and 0.026 for the GPU, which tells me X-Plane is bound by the performance of my CPU.

So my question is: is there some other potential bottleneck in my system that's keeping it from using more of my CPU thread, as it's only at 60-70%? Or is there some setting in X-Plane that is notorious for draining the CPU? It is possible that, having been away from the scene for a few months, I've missed some things, but I was hoping that with a system like this I'd be able to run X-Plane pretty close to max settings.

Regards,
Kieran
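For anyone reading along, here's roughly how I'm reading those overlay numbers; a small Python sketch, with the interpretation of the three values as total frame time and per-frame CPU/GPU time (in seconds) being my assumption.

```python
# The three overlay times from the post above, read (my assumption) as total
# frame time and per-frame CPU / GPU time, in seconds.
frame_time = 0.061
cpu_time = 0.056
gpu_time = 0.026

print(f"~{1.0 / frame_time:.1f} fps")  # ~16.4 fps, in line with the ~20 fps I'm seeing

# Whichever per-frame time sits closest to the total frame time is the limiter.
print(f"CPU busy {cpu_time / frame_time:.0%} of the frame, "
      f"GPU busy {gpu_time / frame_time:.0%} -> CPU-bound")
```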
-
Yeah sure, that's fine. I followed a post on another forum (one that some people here don't like to speak of, but let's not get into politics). I'll see if I can find a link for you, but by all means PM me if you need a hand, or just reply here. EDIT: Here's the link. It's not as well explained as I remember it being, but it'll be a good basis.
-
I believe that the current versions of the script don't have import functionality.
-
I have a similar setup, except with a single GPU (a 780 Ti), and I run three instances of X-Plane on my computer. I have three separate copies, one for each screen, but everything except the "Output" folder within X-Plane's root directory is symbolically linked to the center copy. This means you don't need the massive size of X-Plane three times over, and you only have to install aircraft and scenery once. You can then set up each instance separately; you will have to go into the network settings and change the ports and such, because each instance needs its own port number. This setup means that you only need one copy of X-Plane, because it's all one computer, but it also gives you better control over the screen setup. For example, the side monitors in my setup have an offset from the FOV of the center monitor, meaning each is drawing what you'd see if you moved your head left or right, which is appropriate because my monitors curve around me.

I'm not sure exactly how the two GPUs would work. What I think should work is running the two side monitors off one and the main monitor off the other, in whatever arrangement shares VRAM best. You may have to change the primary display to one on the correct GPU before you start the three instances of X-Plane; from what I've read elsewhere, X-Plane goes by the primary monitor no matter which screen it's actually being drawn on. But this can easily be done in a .bat file that you use to launch all three instances with the click of a button.
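If it helps, here's a rough Python sketch of the symlink layout I described. The install paths are just placeholders, and on Windows creating symlinks needs admin rights (or Developer Mode), so adjust to taste.

```python
# Rough sketch of the symlink layout: a side copy whose root contents all
# point back at the center install, except "Output", which each instance
# keeps for itself. Paths are placeholders; on Windows, os.symlink needs
# admin rights or Developer Mode.
import os
from pathlib import Path

CENTER = Path(r"D:\X-Plane 10")        # full center-screen install (placeholder)
SIDE = Path(r"D:\X-Plane 10 - Left")   # side-screen copy (placeholder)
KEEP_LOCAL = {"Output"}                # folders that stay separate per instance

SIDE.mkdir(exist_ok=True)
for item in CENTER.iterdir():
    if item.name in KEEP_LOCAL:
        continue
    link = SIDE / item.name
    if not link.exists():
        os.symlink(item, link, target_is_directory=item.is_dir())
```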
-
Basically, as tomcat357 said, no. The master machine will output its visuals at the best frame rate it can, based on its settings. It then sends the current state of the simulator over the network. The slave machine then uses that data to draw its frames, at the fastest rate it can based on its settings. So the two machines are operating separately, each at its own frame rate, based on its own rendering settings. That build you have there looks good, though.
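To illustrate the decoupling, here's a toy Python sketch; this is not X-Plane's actual network protocol, packet format or port numbers, just the general pattern of a master broadcasting its newest state every frame while the slave renders whatever state it last received, each loop at its own rate.

```python
# Toy decoupling sketch -- NOT X-Plane's real protocol or ports. The master
# broadcasts its newest state every frame; the slave renders whatever state
# it last received, each loop at its own rate.
import json
import socket
import time

ADDR = ("192.168.1.20", 49100)  # slave's address; both values are placeholders

def master_loop(frames: int = 300) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for frame in range(frames):
        state = {"frame": frame, "alt_ft": 3000 + frame}  # stand-in sim state
        sock.sendto(json.dumps(state).encode(), ADDR)
        time.sleep(1 / 30)  # this machine happens to manage ~30 fps

def slave_loop(seconds: float = 10.0) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", ADDR[1]))
    sock.setblocking(False)
    latest = None
    end = time.time() + seconds
    while time.time() < end:
        try:
            while True:            # drain the socket, keep only the newest packet
                latest, _ = sock.recvfrom(1024)
        except BlockingIOError:
            pass
        if latest is not None:
            print("drawing", json.loads(latest))  # "render" the newest known state
        time.sleep(1 / 45)         # this machine happens to manage ~45 fps
```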
-
I'm using the scripts by Ondrej Brinkel for Blender 2.69+, and it's the latest version from GitHub.
-
Haha, I just noticed that. It wasn't like that before; the export script went a little crazy. I've got it fixed now, and worked out that if I added an "ATTR_manip_none" after each manipulator in the .obj it worked perfectly. The only issue left now is that graphical glitch, but it might just be that, a glitch. Thank you for your time Jim, much appreciated!
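In case anyone hits the same thing, here's a rough Python sketch of automating that fix. It assumes the reset belongs right after the TRIS block that each manipulator drives, and that all manipulator attributes start with "ATTR_manip_"; the file name is just a placeholder.

```python
# Insert "ATTR_manip_none" after the TRIS block that each manipulator drives,
# so the manipulator doesn't leak onto the geometry that follows. Assumes all
# manipulator attributes start with "ATTR_manip_"; file name is a placeholder.

def reset_after_manips(path: str) -> None:
    with open(path, "r", encoding="utf-8") as f:
        lines = f.readlines()

    fixed = []
    pending = False  # a manipulator is active and hasn't been reset yet
    for line in lines:
        cmd = line.split()[0] if line.split() else ""
        prev = fixed[-1].split()[0] if fixed and fixed[-1].split() else ""
        if pending and prev == "TRIS" and cmd != "TRIS":
            fixed.append("ATTR_manip_none\n")  # manipulated TRIS block just ended
            pending = False
        fixed.append(line)
        if cmd.startswith("ATTR_manip_") and cmd != "ATTR_manip_none":
            pending = True
        elif cmd == "ATTR_manip_none":
            pending = False
    if pending:
        fixed.append("ATTR_manip_none\n")

    with open(path, "w", encoding="utf-8") as f:
        f.writelines(fixed)

reset_after_manips("cockpit.obj")  # placeholder file name
```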