Thursday, January 16, 2014

Alles Im Fluss - open beta :)

For the start of the new year I am happy to announce that a long-time project (dating back to 2008) is finally ready for the public. "Alles Im Fluss" (everything flows), a 3dsmax plugin to aid poly modelling, is now available as an open beta.

Alles Im Fluss provides the ability to quickly and easily draw polygon strips, connections or extrusions, and cap holes while maintaining clean, mostly quad-based topology.

One single tool provides you with all functionality, depending on the sub-object type you are in or the keyboard modifiers used.

The tool provides you with control to refine the surface flow of connections or caps and replay drawn paths on other geometry.

Head over to and grab a copy for evaluation (fully-featured)!
Pricing is yet to be determined, however, you can expect it to be the cost of one game.

I hope to keep updating it for a while; it's a nice change of topic from my regular graphics-programming job at NVIDIA, back to my artist roots. The next goal is to be able to "pick up" paths from existing geometry and then replay them.

Saturday, March 30, 2013

Simple GLSL compilation checker

As NVIDIA's cgc is getting somewhat dated (it can compile GLSL as well), I threw together a simple command-line tool for basic offline compilation of GLSL shaders. Find it at the GitHub repository

Sunday, June 3, 2012

tangent space can cost extra money

Although tangent-space normal mapping has been used in games for a while now, there is often still one major flaw left in the asset pipeline: unsynchronized tangent space (TS).

While TS as such is defined mathematically, and most people end up using similar (but not necessarily the same) definitions, it is a per-triangle feature. Therefore, the actual per-vertex storage can vary as well. There are different ways to smooth the vectors into a per-vertex attribute (just as vertex-normal smoothing sometimes breaks geometric vertices open for hard edges). Furthermore, there are some typical optimizations for actual display, such as reconstructing one of the vectors as the cross product of the others, or avoiding per-pixel normalization of the matrix.
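To make the per-triangle nature concrete, here is a minimal Lua sketch (helper names are my own, not taken from any specific tool) of one common way to derive a tangent/bitangent pair for a single triangle from its positions and UVs. Averaging such per-triangle results into per-vertex attributes is exactly where pipelines start to diverge.

```lua
-- vector subtract helper (positions are {x,y,z} tables, uvs are {u,v})
local function sub(a, b) return {a[1]-b[1], a[2]-b[2], a[3]-b[3]} end

-- per-triangle tangent/bitangent from position and uv deltas
local function triangleTangent(p0, p1, p2, uv0, uv1, uv2)
  local e1, e2   = sub(p1, p0), sub(p2, p0)
  local du1, dv1 = uv1[1]-uv0[1], uv1[2]-uv0[2]
  local du2, dv2 = uv2[1]-uv0[1], uv2[2]-uv0[2]
  local r = 1 / (du1*dv2 - du2*dv1) -- degenerate uvs would divide by zero
  local tangent = {
    (e1[1]*dv2 - e2[1]*dv1) * r,
    (e1[2]*dv2 - e2[2]*dv1) * r,
    (e1[3]*dv2 - e2[3]*dv1) * r,
  }
  local bitangent = {
    (e2[1]*du1 - e1[1]*du2) * r,
    (e2[2]*du1 - e1[2]*du2) * r,
    (e2[3]*du1 - e1[3]*du2) * r,
  }
  return tangent, bitangent
end
```

Even with this exact derivation, two pipelines can still disagree on the smoothing, handedness, and normalization steps that follow it.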

Major applications such as 3dsmax have suffered from this problem in past versions: the realtime display was not matched to the baker (only the offline renderer was correct). Some developers had tools for this, such as id Software in the Doom 3 days, or Crytek (who document their tangent-space math quite well on the web). For a lot of others, even big players, there is no public information on the tangent space used in rendering.

This mismatch of "encoder/decoder" costs money. Artists spend extra time fixing interpolation issues, adding geometry, tweaking UV layouts... to get visually acceptable results. And yet often their preview (e.g. inside the modeller) might still be "off" in the end (but close enough). As a coder I might think "I know my math" but be unaware of the different baking tools and import/export issues. As an artist I work with what I was given and am used to "working with limitations". This causes unnecessary frustration and can lead to disputes if "one" side actually knows better.

And knowing better should be no problem today. Popular baking tools, such as xnormal, allow custom tangent-space definitions. I've worked on enhancing the 3dsmax pipeline myself: the 3pointshader fixed the mismatch in old max versions simply by encoding the "correct" tangent space (synced to 3dsmax's default baker) as 3 UVW channels. That way the realtime shader was matched to the baker. Accessing UVW data is also not too hard for import/export. Furthermore, 3dsmax allows modifying the bake process through a plugin; one could use this to apply the same UVW-channel trick, or to disable per-pixel normalization during baking (sample project with sources here)

So please, for the sake of saving time (and money) and sparing billions of "my normalmap looks wrong" worries by artists: all sides, spend one day to talk it through. Educate the artists on which "bakers" they can use, and educate the coders that their TS choice (all the nitty-gritty details) matters for the asset pipeline. It might not have mattered in the bump-map days or when testing simple geometry, but once you bake complex high-res to low-res it does!

Friday, September 9, 2011

mini lua primer

-- tables are general containers and can be indexed by anything (numbers,
-- tables, strings, functions...)

tab = {}
blubb = "some value"
local function blah() end
tab[blah] = blubb
tab.name = blubb -- is the same as
tab["name"] = blubb

-- tables are always passed as "pointers/references" never copied
-- array index starts with 1 !!
-- they become garbage collected when not referenced anymore

pos = {1,2,3}
a = { pos = pos }
pos[3] = 4
pos = {1,1,1} -- rebinds the variable pos to a new table
a.pos[3] -- is still 4, a.pos references the original table

--[[ multiline 
comment ]]

blah = [==[ a multiline string; the same bracketing with
multiple = signs also works for comments and allows nesting ]==]

--- multiple assignment allows easy swapping
a,b = b,a

-- object oriented stuff
-- the : operator passes the table itself as the first argument
a:func(blah) -- is the same as
a.func(a,blah)

-- metatables allow instances to fall back to class tables
myclass = {}
myclassmeta = {__index = myclass}

function myclass:func()
  -- "self" is an implicit variable created by the : definition,
  -- holding the first argument passed to func
  return self
end
-- above is equivalent to
myclass.func = function (self)
  return self
end

object = setmetatable({}, myclassmeta)
object:func() -- is now the same as
myclass.func(object)

-- until func gets specialized per object
function object:func()
  -- lua will look up keys first in the object table, then via the metatable
  -- it will only ever write to the object table
end

--- upvalues for function specialization

function func(obj)
  return function ()
    return obj * 2
  end
end

a = func(1)
b = func(2)

a() -- returns 2
b() -- returns 4

--- non passed function arguments become nil automatically
function func (a,b)
  return a,b
end
a,b = func(1) -- b is "nil"

--- variable args
function func(...)
  local a,b = ...
  --- a,b are the first two args
  --- you can also gather all args in a table
  local t = {...}
end

--- conditional assignment chaining
--- 0 is not "false", only "false" or "nil" are

a = 0
b = a or 1 -- b is 0, if a was false/nil it would be 1

c = (a == 0) and b or 2 -- c is 0 (b's value)

-- the first time a value is "valid" (non-false/nil) that value is taken
-- that way you can do default values

function func(a,b)
  a = a or 1
  b = b or 1
end
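One caveat with the and/or chain (my own example, not from the original notes): it silently picks the fallback when the middle value is itself false or nil, so it is not a full ternary operator.

```lua
-- and/or as a pseudo-ternary breaks when the "then" value is false/nil
local function pick(cond, yes, no)
  return cond and yes or no
end

pick(true, 1, 2)      -- 1, as expected
pick(true, false, 2)  -- 2, not false: the false "yes" value falls through to "no"
```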

--- sandboxing

function sandboxedfunc()
  -- after setfenv below we can only call what is enabled in the environment
  -- functions not provided by the environment are not reachable here
  -- blubb gets created in the current environment
  blubb = doit()
end

local enva = {
  doit = function () return 1 end,
}

local envb = {
  doit = function () return 2 end,
}

setfenv(sandboxedfunc, enva)
sandboxedfunc()
-- enva.blubb is now 1

setfenv(sandboxedfunc, envb)
sandboxedfunc()
-- envb.blubb is now 2

-- sandboxedfunc could also come from a file, which makes creating fileformats
-- quite easy, as they can internally be lua code
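As a small illustration of that idea (the config text and variable names are made up for the example): a config "file" is just Lua code run inside an empty environment.

```lua
-- hypothetical config text as it could be read from a file
local configtext = "width = 640 height = 480"

local env = {}
local chunk
if setfenv then
  -- lua 5.1: compile, then redirect the chunk's globals into env
  chunk = loadstring(configtext)
  setfenv(chunk, env)
else
  -- lua 5.2+: load takes the environment directly
  chunk = load(configtext, "config", "t", env)
end
chunk()
-- env.width is now 640, env.height is 480
```

The host application then reads plain values out of `env`, and the "fileformat" gets expressions, comments, and functions for free.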

--- functions without () and function chaining

-- to make ini/config files quite easy, lua allows omitting () for function 
-- calls when the argument is either a string or a table

function testfunc( a )
end

-- valid calls to the above function
testfunc "blah"
testfunc {1,2,3}

-- we can even expand this to create fileformat like structures

function group( name )
  return function (content)
    local grp = {
      name = name,
      content = content,
    }
    return grp
  end
end

local grp = group "test" {1,3,5}
-- equivalent to: group("test")({1,3,5})
-- grp.content[2] would be 3

-- could also build a hierarchy
local grp = group "root" {
  group "child a" {},
  group "child b" {},
}

-- grp.content[1].name would be "child a"

Saturday, January 1, 2011

estrela as shader editor

Recently doing more work with Lua and Cg/GLSL again, hence added a couple features to estrela editor.

Lua-wise I added some experimental type-guessing, mostly meant to aid auto-completion for luxinia classes. Also, the lua-apis that get loaded can now be specified per interpreter, so that no luxinia functions get suggested when you are using a "normal" lua interpreter. Getting useful auto-completion and api help is still a big task though. Especially getting user-created functions/classes in somehow would be great. Maybe a static tool that generates files from a lua project or so.

Most problems with "dynamic" text analysis were that when the user edits old stuff, you also have to somehow check whether keywords were changed, added, removed... hence I avoid that complexity for now. I'd rather prefer a static solution that the user triggers; that way it's hopefully simpler and more robust.

Another focus lately was the Cg tool. I've added support for nvShaderPerf and an ARB/NV program beautifier (indenting branches/flow, and inserting comments as to which constants map to which variable). That makes it a bit easier to see what triggers branching and so on.
I've also added automatic setting of the GLSL input flag for cgc and some automatic defines such as "_VERTEX_"... so that one can use #ifdef _VERTEX_ and still have all GLSL shader code in one file. A GLSL spec and api description is now also part of estrela; I took the nice opengl 4.1 quick reference card as a base. So much for now.

Still haven't found time to push the open-sourcing of luxinia further and add GLSL shader management to it (which will require ARB_separate_shader) for PhD work. But anyway, new year now ;)

Monday, July 5, 2010

3point shader

So long no updates; mostly I am still working on PhD stuff. Finally the publication on smart-visibility rendering techniques for medical datasets is out.
And I am mostly working on a CUDA port of vessel histogram analysis and a bigger system for coronary heart-vessel exploration.

Furthermore, the 3point shader is also released (in both a commercial and a free non-commercial edition). The free edition uses the same plugin and shader, but doesn't have the convenient and time-saving ui, nor the sample files... that said, if you want to play with it, you can do so for free.

A major contribution of this work is the fix for 3dsmax's broken tangent-space normalmap display in the realtime viewport: max doesn't send the same tangent space as it does when the scanline baker generates the normalmap. Autodesk has been made aware of this problem. The cool thing is that, since it simply is a viewport fix, one gets a great quality improvement out of all standard bakes. And people no longer have to waste additional geometry to fix the "smoothing" issues the broken viewport had.
The thing is that many game companies have "taken" the broken viewport as the "correct" tangent space, which it simply isn't, as Autodesk has several inconsistencies within the max SDK for exposing the tangent space. If you are interested in the fix or how to use it in game engines, you can contact 3pointstudios about it.

Another addition to the plugin/shader is mirroring support for object-space normalmaps. I have experimented with that quite a bit, as well as with transforming object-space to tangent-space in offline tools to allow exchanging the baker. Anyway, the plugin generates per-vertex reflection vectors for the os-normalmaps, as long as you offset mirrored uv parts by a multiple of 1 (which you would do anyway to prevent baking overlaps).

Saturday, November 14, 2009

function call highlighting

I am quite a Visual Assist addict and miss some of its features in other IDEs. For luxinia's Lua and Cg use, I tweak the estrela editor to my own needs. wxWidgets's scintilla version doesn't allow you to use the style-bits as flexibly as I'd like; as a result the lexer overwrites the manual changes one does. But with indicators, at least, you can make sure they aren't modified. So the latest addition is function-call highlighting, something I really like in VA.

As you might see in the text, I am also working on a Lua binding for OpenCL. Whilst I've used manual bindings before, this time I used swig. It needed a few "dirty" hacks and a swig utility library, but now it more or less works fine. Binding, sources, and samples will come with a future luxinia release.