Wednesday, August 31, 2011

New Blog

Due to the nymwars, I have decided to move my blog to Posterous, which conveniently has a tool to import my entire blog over from Blogger/Blogspot.

The URL for my new blog is http://zauberparacelsus.posterous.com/

Thursday, July 21, 2011

Lighting Tricks


With the Second Life viewer, dynamic lighting is done using "vertex lighting".  If you enable wireframe mode (CTRL-SHIFT-R), you'll see the vertex points that make up an object as the spots where the wireframe lines intersect.

Now, vertex lighting has some accuracy problems.  The viewer calculates the brightness of a light at each vertex based on the light's parameters (intensity, radius, falloff), and then applies a gradient across the prim's surface.  With smaller objects, this isn't a problem; larger objects, such as megaprims, run into trouble.
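
To make this concrete, below is a rough sketch, in LSL-style code, of how per-vertex brightness might be computed.  It assumes a simple linear-falloff model for illustration; the viewer's actual formula differs:

//Hypothetical per-vertex lighting sketch (NOT the viewer's real formula):
float vertexBrightness(vector lightPos, vector vertexPos,
                       float intensity, float radius, float falloff)
{
    float dist = llVecDist(lightPos, vertexPos);
    if (dist >= radius)
        return 0.0; //a vertex outside the light's radius gets no light
    float atten = 1.0 - (dist / radius); //light fades toward the edge
    return intensity * llPow(atten, falloff); //falloff shapes the gradient
}

The surface between vertices only gets a gradient interpolated from these per-vertex values, which is why a small light can vanish entirely between the widely-spaced vertices of a megaprim.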

With megaprims, the vertex points are spread much further apart.  If a light with a small radius is too far from one of these points, it will produce little or no light; if it is close to a vertex point, it'll produce too much light.  The white light in the snapshot below has a radius of just one meter.  But because it is right on top of a vertex point, its lighting radius is far greater than one meter!

[Snapshot: a one-meter-radius light placed on a megaprim vertex, illuminating a much larger area]

Now, take a look at the two snapshots below.  In the before snapshot, you can see that the torch on the left is casting a lot of light onto the floor while the one on the right is casting none, and the four torches in the background are casting little or no light against the floor.  So, what's going on?

[Before/after snapshots: torches lighting a megaprim floor vs. the same torches over a sculpty floor]

The lefthand torch is right near a vertex point, over-representing its light, while the righthand torch is centered between two vertex points, under-representing its light.  In the after image, the torches are all emitting light at the desired levels.  So what is the difference between the two versions?

In the before image, the floor is a megaprim cube.  In the after image, the floor is a sculpty in the shape of a flat square.  A cube prim has a vertex resolution of 4x4 per face, while the flat-square sculpty has a resolution of 32x32 vertices.  Because of this, the vertex points are much closer together, allowing for a more even level of lighting.

The disadvantage of this method, however, is that the floor now has 64x as many vertex points (1024 vs. 16), so it will incur a higher rendering cost.  This isn't much of a problem if you use the trick sparingly, but using it all over the place will result in more lag.

Below is a copy of the sculpt map that I used to create the floor.  I'm releasing it into the public domain, so feel free to use it however you wish :-)

[Image: the flat-square sculpt map]

Flexible Meshes: The Sword That Cuts Both Ways

So, Linden Lab has decided to cut flexible meshes, due to lag concerns.  I think Linden Lab is wrong on this: while flexible meshes would mean more lag, they would also mean an opportunity to reduce it, by replacing the older flexiprims with a more efficient design!

Flexiprims have always been a major source of lag, and they are most commonly used for the creation of dresses, skirts, and hair.  But, if you were to create a skirt from a mesh and use it as clothing, you could reduce the lag generated by the skirt for these reasons:
  • A flexiprim skirt uses a lot of flexiprims.  One friend was wearing one with 18 prims, while another was wearing one using 51 prims!
  • Skirts are built from cylinder prims.  A plain cylinder uses 98 vertices at maximum LOD, and 14 vertices at minimum.
  • Making those cylinders flexible raises their LOD a lot: a flexible cylinder has 56 vertices at minimum LOD and 216 at maximum.  And if you make them hollow, the vertex count roughly doubles.
As you can see, flexible prims on their own are rather inefficient.  A few of them scattered about won't do much to affect lag, but a ton of them all over will.

So, I've made the point that flexible prims are laggy.  Why would flexible meshes allow you to reduce lag?

The answer is simple: optimization.  A skirt maker could create a flexible mesh skirt as one single object that, rather than having the vertex count of a few dozen flexible prims, has a vertex count equivalent to just 4-8 of them, depending on how ornate it is.  Mesh hair would benefit from the same LOD improvements.
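
To put rough numbers on that, using the counts above and assuming every prim is at maximum LOD: the 51-prim skirt works out to 51 × 216 = 11,016 vertices (and over 22,000 if the prims are hollowed), while a mesh skirt equivalent to 4-8 flexible prims would be in the range of 864 to 1,728 vertices, roughly a tenfold reduction.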

There will, of course, be people who would overuse this functionality and mesh itself, resulting in heavy lag-inducing objects with lots of needless and unnoticeable detail.  My personal hope is that such "detail gluttons" will receive a very quick and very harsh lesson for their mistakes.

Sunday, June 5, 2011

A Strange Occurrence

A few days ago, I had something bizarre happen.  My video card began to show rendering glitches in the Imprudence viewer (a 3rd-party client for Second Life), where the vertices of objects and avatars were distorted into massive planes.  So, I shut down Imprudence.

I knew what caused the distortions, of course: GPU overclocking that I had forgotten to turn off after it provided only marginal benefit.  However, after turning it off and restarting Imprudence, my rendering performance dropped by at least half, so I just rebooted.

After rebooting, performance was back at its normal levels.  However, I noticed something strange right away: the viewer was using half its normal amount of CPU, without any gain or loss in performance.  I thought this was strange, though I liked the idea of lower CPU use.

The next day, I figured out what had happened.  On bootup, I'm shown a choice of which Linux kernel to boot with: the stock kernel shipped with Arch Linux, a kernel with the -CK patch set, and a fallback version of each.  By default, it boots the -CK kernel, but I had been manually choosing the stock kernel for the past few reboots due to stability issues I was having.

This time, I had forgotten about the issues and it booted into the -CK kernel.  The halved CPU usage struck me as strange, because I had never seen it under the -CK kernel before.  So, I rebooted into the stock kernel and saw that CPU usage was back up at its normal levels, then rebooted again into the -CK kernel, and the CPU usage of the Imprudence viewer was back down to half the normal amount.

The only reason I can think of for why I never noticed this before is that I had been using cpufrequtils, a Linux utility for CPU frequency scaling.  It adaptively underclocks the system CPU during periods of low CPU usage, in order to reduce electricity usage and how much heat the system produces.  Very handy on laptops, but also good for desktops.
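
For reference, cpufrequtils controls this through a "governor"; something like the commands below (the core number and the "ondemand" governor are just examples) is what I mean by adaptive underclocking:

#Show the current frequency scaling settings:
cpufreq-info

#Switch CPU core 0 to the "ondemand" governor, which
#underclocks the CPU while it is mostly idle:
cpufreq-set -c 0 -g ondemand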

However, I am not using cpufrequtils now, because I had believed it was also behind the stability issues.  I now know the issues were actually the nVidia driver not wanting to play nice with xorg 1.10.

So, after doing some thinking and reading up on the -CK kernel's BFS scheduler, I came to the conclusion that the reduced CPU usage comes from BFS eliminating idle CPU time, but I lack the knowledge to confirm it.  If anyone knowledgeable about BFS or the viewer code can tell me what is causing this, I'd appreciate it.

Sunday, May 1, 2011

What is Explicit Typecasting?

One of the first things you'll learn in scripting, and in programming in general, is variables: a nametag for a piece of data that you can store, modify, and use.  Some of you may be familiar with the "box" example, where a variable is a labeled box holding a piece of paper with the information written on it.


Now, variables can hold different kinds of information, or data types.  Under LSL, there are seven data types:

  • Integer: A whole number, such as 42.
  • Float: A non-whole or decimal-point number, such as 2.5, 1.25, or 3.141592654
  • String: Text data, such as "Hello, world!"
  • Key: A special kind of string, used to reference the data for avatars, inventory, objects, textures, and just about everything.  They are also known as a UUID, short for Universally Unique ID.  This is the key for the default plywood texture: "89556747-24cb-43ed-920b-47caed15465f"
  • Vector: A series of three float values stored in the same variable.  Used for colors, Euler rotations, and positions.  Example: <1.2, 2.5, 8.6>.
  • Rotation: A series of four float values stored in the same variable.  Used for quaternion rotations.
  • List: A special variable that allows you to store many other variables of any other kind under one variable.  Example:  ["hello", "world", 3.141, 42]
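
Declared in LSL, those seven types look something like this (the UUID is the plywood texture key from above):

//Examples of declaring each of the seven types:
integer answer = 42;
float pi = 3.141593;
string greeting = "Hello, world!";
key plywood = (key)"89556747-24cb-43ed-920b-47caed15465f";
vector position = <1.2, 2.5, 8.6>;
rotation spin = <0.0, 0.0, 0.0, 1.0>;
list data = ["hello", "world", 3.141, 42];
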
However, sometimes a variable may need to be used where a different type is expected.  So, what happens?  Type casting: converting one variable type to another.  There are two ways to do this: implicit typecasting and explicit typecasting.

What's the difference?  Implicit typecasting converts the variable type to another type automatically, while explicit typecasting requires an explicit instruction for the conversion.  So, if you cast the integer 42 to a float, it would become 42.00.  And if you cast that same integer to a string, it would become "42".

Under the old, buggy script engine from the original OpenSim code, implicit typecasting was allowed all over the place.  This is problematic because it allows for sloppier code and has no place in a strongly-typed language like LSL.  So, under the new Phlox script engine at InWorldz, explicit typecasting is now enforced, bringing it in line with Linden Lab's implementation.

This means that some scripts written for OpenSim or InWorldz may break.  Scripts written in Second Life won't break as a result, because explicit typecasting is already enforced there.

However, the fix is rather easy: you just have to add an explicit typecast.  Say you want to convert 3.14 to a string:

//This won't work under Phlox:
float pi = 3.14;
string text = pi;

//This will work under Phlox:
float pi = 3.14;
string text = (string)pi;

Pretty simple, isn't it?  By putting the variable type in parentheses before the variable, you can explicitly typecast most variables.  However, there are a couple of exceptions, shown in code after this list:
  • Rotations and Vectors cannot be cast from one to the other.
  • Keys can only be cast to strings, and only strings can be cast to keys.
  • Anything can be cast to a list, but the reverse isn't true.
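
Here are those exceptions in code form, using the plywood UUID from earlier; the commented-out lines are casts that won't compile:

//Key <-> string casts work in both directions:
key plywood = (key)"89556747-24cb-43ed-920b-47caed15465f";
string text = (string)plywood;

//Anything can be cast to a list:
list data = (list)plywood;

//These two lines would NOT compile:
//rotation rot = (rotation)<1.0, 2.0, 3.0>;
//float number = (float)data;

To get a value back out of a list, you don't cast; you use the llList2* functions instead, such as llList2String or llList2Float.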

As well, there is one remaining element of implicit typecasting, which also exists in Second Life: integer values may be implicitly typecast to floats, but not the reverse.  So, the first snippet below is valid, while the second is not:

//This will work under Phlox:
float number = 42;


//This won't work under Phlox:
integer number = 42.00;


Also, there are two ways to cast normal variables into lists, though the second method shown below technically creates a single-element list:

//First Method:
float number = 3.14;
list data = [number];


//Second Method:
float number = 3.14;
list data = (list)number;


This blog post is a work in progress and will be updated as needed to improve the explanations.  If anyone needs clarification on parts of this or has suggestions on how to improve it, please contact me by IM at InWorldz, or post your thoughts in the comments section.

Thursday, March 17, 2011

Resistance to Change?

Well, yesterday Hamlet Au wrote a blog post essentially stating that Second Life residents are resistant to change, and that this makes them enemies of Second Life's survival.

What are my thoughts on this?  Just two:

1) Not everyone will agree on what changes are good or bad for Second Life.  Second Life has a very diverse population: what one person considers a great idea, others will consider a horrible one.  Open sourcing the viewer, sculpts, meshes, and Viewer 2.0 all fall under this.

2) I've noticed a great deal of resistance to the changes Hamlet Au wants to see in Second Life, such as Facebook integration and gamification.  One must wonder whether this affected his reasoning at all.

Tuesday, March 1, 2011

A Geekier Way to Set Your Default Browser in Linux

One of the things I've disliked about having programs from multiple desktop environments installed is that you need to change the default browser in more than one settings panel.  Some programs use their own browser settings, requiring you to change each one, and some proprietary applications, such as Skype, offer no apparent way to change the default browser at all.

This can be very tedious if you like to change browsers frequently; I tend to try out the beta builds of Firefox for a few days before switching back to Chromium.  So, I've put together a useful little trick for working around the problem.  Here's how you do it under Arch Linux.


First, create a new text file named "browser" in your /usr/bin directory as root.  Then, put the text below into the text file:

#!/bin/bash
chromium "$@"


Replace chromium with the command for whichever browser you prefer to use, and make the script executable (chmod +x /usr/bin/browser).  Next, go to your /etc/profile.d directory (on systems without it, use /etc/profile) and create a new file named browser.sh (or whatever .sh name you prefer).  Set its contents to this:

#!/bin/bash
export BROWSER="/usr/bin/browser"


Then log out of your session and back in (or reboot).  Most applications will now use whatever /usr/bin/browser launches as the default browser.  For the ones that don't, set the browser manually to /usr/bin/browser in the application's preferences, or in the GNOME and KDE preferences.  Once that's all set up, you will only need to edit /usr/bin/browser to change the default browser.

However, the instructions above change the default browser system-wide for any application that observes the BROWSER environment variable.  If you have multiple users on your system, you may wish to instead export BROWSER in the .bash_profile or .bashrc file in your home directory.
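
As a sketch of that per-user setup (assuming your login shell is bash and you keep the wrapper in ~/bin):

#!/bin/bash
#Save as ~/bin/browser and mark it executable (chmod +x ~/bin/browser);
#each user picks their own browser here:
firefox "$@"

And then in your ~/.bash_profile:

export BROWSER="$HOME/bin/browser"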

Thursday, January 6, 2011

Strange Rendering Glitch

For almost two years, I've been seeing a bizarre rendering glitch on my system.  Some textures in use by the Second Life viewer will suddenly start showing strange colored rectangles.  At first, only one or two show up, but slowly over time more and more rectangles appear, and I start to see things like this:

[Screenshot: the default plywood texture covered in colored rectangles]

More rectangles continue to accumulate on the texture over time, and they persist until I either relog or leave the area and come back after the texture has been cleared from graphics memory.  It has occurred with every Second Life viewer I've used, and most frequently with the default plywood texture.

I also see the issue on my desktop background, which accumulates rectangles until it is refreshed.  I see it in other apps too (like Firefox or Chrome), where the rectangles vanish after the screen or UI is refreshed.  It happens while I am running any 3D game or application.

I first started seeing this after upgrading my video card from a Radeon HD 3750 to an nVidia GeForce 9800 GT, so I am assuming it is either a driver issue or a configuration error.  My operating system throughout this time has been, with the exception of a brief switch to Ubuntu, Arch Linux (originally x86_64, now i686).

My first guess is that it's a configuration issue in my system's xorg.conf, which I've posted below.  As a note, I am aware that xorg.conf is not normally required anymore; however, my video card will NOT work without one, so I require it.

If you want to see the file with syntax highlighting, I suggest viewing this link instead:
http://pastebin.com/SbyhVUGZ


# nvidia-settings: X configuration file generated by nvidia-settings
# nvidia-settings: version 256.44 (buildmeister@builder97.nvidia.com) Thu Jul 29 01:59:48 PDT 2010

Section "ServerLayout"
Identifier "Layout0"
Screen 0 "Screen0" 0 0
InputDevice "Keyboard0" "CoreKeyboard"
InputDevice "Mouse0" "CorePointer"
Option "Xinerama" "0"
EndSection

#Section "ServerFlags"
# Option "AutoAddDevices" "False"
# Option "AllowEmptyInput" "False"
#EndSection

Section "Files"
EndSection

Section "InputDevice"
# generated from default
Identifier "Mouse0"
Driver "mouse"
Option "Protocol" "auto"
Option "Device" "/dev/psaux"
Option "Emulate3Buttons" "no"
Option "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
# generated from default
Identifier "Keyboard0"
Driver "kbd"
EndSection

Section "Monitor"
# HorizSync source: edid, VertRefresh source: edid
Identifier "Monitor0"
VendorName "Unknown"
ModelName "DELL E171FPb"
HorizSync 31.0 - 80.0
VertRefresh 56.0 - 75.0
Option "DPMS"
EndSection

Section "Device"
Identifier "Device0"
Driver "nvidia"
VendorName "NVIDIA Corporation"
BoardName "GeForce 9800 GT"
EndSection

Section "Screen"
Identifier "Screen0"
Device "Device0"
Monitor "Monitor0"
DefaultDepth 24
Option "CoolBits" "1"
Option "TwinView" "1"
Option "TwinViewXineramaInfoOrder" "CRT-0"
Option "AllowGLXWithComposite" "true"
Option "metamodes" "CRT: 1280x1024_60 +0+0, DFP: 1440x900_60 +1280+124"
SubSection "Display"
Depth 24
EndSubSection
EndSection

Section "Extensions"
Option "Composite" "Enable"
EndSection