Sunday, June 5, 2011

A Strange Occurrence

A few days ago, something bizarre happened.  My video card began to show rendering glitches in the Imprudence viewer (a third-party game client for Second Life): the vertices of objects and avatars were being stretched into massive planes.  So, I shut down Imprudence.

I knew what caused the distortions, of course: GPU overclocking that I had forgotten to turn off after it proved to give only a marginal benefit.  However, after I turned it off and restarted Imprudence, my rendering performance dropped by at least half.  So I just rebooted.

After rebooting, performance was back at its normal level.  However, I noticed something strange right away: the viewer was using about half its usual CPU, with no gain or loss in performance.  I thought this was odd, even though I liked the idea of lower CPU use.

The next day, I figured out what had happened.  On bootup, I'm shown a choice of which Linux kernel to boot: the stock kernel shipped with Arch Linux, a kernel with the -ck patch set, and a fallback version of each.  By default it boots the -ck kernel, but I had been manually choosing the stock kernel for the past few reboots due to stability issues I was having.
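For the curious, that kernel menu comes from entries along these lines in GRUB's menu.lst (the paths, image names, and root device below are illustrative, not copied from my actual config):

    title  Arch Linux (stock)
    root   (hd0,0)
    kernel /vmlinuz26 root=/dev/sda1 ro
    initrd /kernel26.img

    title  Arch Linux (-ck)
    root   (hd0,0)
    kernel /vmlinuz26-ck root=/dev/sda1 ro
    initrd /kernel26-ck.img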

This time, I had forgotten about the issues and it booted into the -ck kernel.  I thought this was strange, though, because I had never seen this behavior in the -ck kernel before.  So I rebooted into the stock kernel and saw that CPU usage was back up to its normal level.  I rebooted again into the -ck kernel, and the CPU usage of the Imprudence viewer was back down to half the normal amount.
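If you want to reproduce this kind of comparison, the check is simple enough (the process name below is a guess on my part; substitute whatever your viewer's binary is actually called):

    # Confirm which kernel is running; the -ck build shows "ck" in its version string
    uname -r

    # Sample the viewer's CPU usage three times at 5-second intervals
    top -b -d 5 -n 3 -p $(pidof imprudence)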

The only reason I can think of for never noticing this before is that I had been using cpufrequtils, a Linux utility for CPU frequency scaling.  It adaptively underclocks the CPU during periods of low load, in order to reduce electricity usage and the amount of heat the system produces.  Very handy on laptops, but also good for desktops.
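For reference, this is roughly how cpufrequtils is driven from the command line (which governors are available depends on how your kernel was built):

    # Show the current governor, frequency limits, and driver for each CPU
    cpufreq-info

    # Put CPU 0 on the "ondemand" governor, which scales frequency with load
    cpufreq-set -c 0 -g ondemand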

However, I am not using cpufrequtils now, because I had believed it was also behind the stability issues.  I now know that the issues were actually with the nVidia driver not wanting to play nice with xorg 1.10.

So, after doing some thinking and reading up on the -ck kernel's BFS scheduler, I came to the conclusion that the reduced CPU usage comes from BFS eliminating idle CPU time, but I lack the knowledge to confirm it. If anyone knowledgeable about BFS or the viewer code could tell me what is causing this, I'd appreciate it.
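One way I could probably test the hypothesis (a sketch I haven't actually run, using the same guessed process name as above): compare the viewer's accumulated user and system CPU time under each kernel.  If those counters grow at half the rate under -ck too, the viewer is genuinely doing less work; if they grow at the same rate while top reports half the usage, it's more likely a scheduling or accounting difference.

    # utime and stime are fields 14 and 15 of /proc/<pid>/stat, in clock ticks
    awk '{print "utime:", $14, "stime:", $15}' /proc/$(pidof imprudence)/stat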
