
Is this a BUG or a video card mis-feature when using OpenGL?

fravalle

I am currently using an onboard ATI Radeon on Windows, and I select the OGL driver via the -Dj3d.rend system property.
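
For reference, the pipeline is chosen at launch through j3d.rend; a minimal sketch (MyApp and the classpath are placeholders for my launcher), where -Dj3d.rend=ogl gives the failing case and -Dj3d.rend=d3d the working one:
[pre]
java -Dj3d.rend=ogl -cp <classpath> MyApp
java -Dj3d.rend=d3d -cp <classpath> MyApp
[/pre]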

1) I draw a grid composed of thin lines (all LineArray geometry), and then I add a LinearFog node (ExponentialFog shows the same behavior).

2) I use an OrbitBehavior to navigate the scene (set up roughly as sketched below), but in practice I only see the grid lines when I look straight along the x and z axes.

With the D3D driver, however, all lines are drawn correctly.
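
The orbit navigation is attached roughly like this (a sketch assuming a SimpleUniverse; canvas and bounds stand in for my actual fields):
[pre]
// Sketch only: attach orbit navigation to the view platform of a SimpleUniverse.
OrbitBehavior orbit = new OrbitBehavior(canvas, OrbitBehavior.REVERSE_ALL);
orbit.setSchedulingBounds(bounds);
universe.getViewingPlatform().setViewPlatformBehavior(orbit);
[/pre]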

The FOG
[pre]
// Uses javax.media.j3d.LinearFog and javax.vecmath.Color3f.
LinearFog fog = new LinearFog();
fog.setName("FOG-LIN:" + groupScope.getName());
fog.setColor(new Color3f(BCK_COLOR));            // fog color taken from BCK_COLOR (the background color)
fog.setFrontDistance(.5D);                       // fog starts 0.5 units from the eye
fog.setBackDistance(27D);                        // fully fogged at 27 units
fog.setCapability(LinearFog.ALLOW_COLOR_WRITE);
fog.setCapability(LinearFog.ALLOW_DISTANCE_WRITE);

fog.setInfluencingBounds(this.worldBounds);
fog.addScope(groupScope);                        // restrict the fog to this group
this.getBranchRoot().addChild(fog);
[/pre]
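
worldBounds above is just a bounding region large enough to cover the whole scene, something along these lines (the radius is only illustrative):
[pre]
// Sketch only: a bounding sphere big enough to enclose the grid.
BoundingSphere worldBounds = new BoundingSphere(new Point3d(0.0, 0.0, 0.0), 1000.0);
[/pre]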

The GRID
[pre]
// Needs javax.media.j3d.*, javax.vecmath.* and java.awt.Color imports.
static Shape3D createLand(Color col) {
    Color3f cf = new Color3f(col);
    // Color4f cf = new Color4f(col);
    // cf.setW(0);

    int grid = 50;
    int factor = grid * 2;
    int size = 40 * factor;                  // 4000 vertices = 2000 line segments
    LineArray landGeom = new LineArray(size,
            GeometryArray.COORDINATES | GeometryArray.COLOR_3);
    final float len = grid * 10;             // endpoints span -500..500 on x and z
    float newLen = -len;
    // Each pass adds one line parallel to x and one parallel to z.
    for (int c = 0; c < size; c += 4) {
        landGeom.setCoordinate(c + 0, new Point3f(-len, 0.0f, newLen));
        landGeom.setCoordinate(c + 1, new Point3f( len, 0.0f, newLen));
        landGeom.setCoordinate(c + 2, new Point3f(newLen, 0.0f, -len));
        landGeom.setCoordinate(c + 3, new Point3f(newLen, 0.0f,  len));
        newLen += 1f;                        // 1-unit spacing between lines
    }

    // Same color for every vertex.
    for (int i = 0; i < size; i++)
        landGeom.setColor(i, cf);

    return new Shape3D(landGeom);
}

[/pre]
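
The shape returned by createLand() ends up under the same group that the fog is scoped to, roughly along these lines (a simplified sketch; the gray color is just an example):
[pre]
// Sketch only: the grid goes under groupScope, the group passed to fog.addScope(...).
Shape3D land = createLand(Color.gray);
groupScope.addChild(land);
this.getBranchRoot().addChild(groupScope);
[/pre]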

So what is wrong?

Some screenshots:

D3D straight:

D3D angled:

OGL straight (some lines disappear):

OGL angled (all lines disappear):

Message was edited by: fravalle