Although tangent-space normal mapping has been used in games for a while now, one major flaw often remains in the asset pipeline:
Unsynchronized Tangent Space (TS)
While TS as such is defined mathematically, and most people end up using similar (but not necessarily the same) definitions, it is a per-triangle property. The actual per-vertex storage can therefore vary as well: there are different ways to smooth the vectors into a per-vertex attribute (just as vertex-normal smoothing sometimes breaks geometric vertices open for hard edges). Furthermore, there are typical optimizations for actual display, such as reconstructing one of the vectors as the cross product of the others, or avoiding per-pixel normalization of the matrix.
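To make the "per-triangle, many conventions" point concrete, here is a minimal sketch of one common way to derive a tangent basis from a triangle's positions and UVs (the Lengyel-style formulation). Note that this is just one possible definition: the UV orientation, the handedness sign, and whether the bitangent is stored or reconstructed as a cross product all vary between tools, which is exactly the sync problem.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)  { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
static float dot(Vec3 a, Vec3 b)  { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return {v.x/len, v.y/len, v.z/len};
}

// Per-triangle tangent (T) and bitangent (B) from positions p0..p2 and
// per-vertex texture coordinates (u,v). One convention of several:
// some pipelines instead store only T plus a sign and reconstruct
// B = sign * cross(N, T) in the shader.
void triangleTangent(Vec3 p0, Vec3 p1, Vec3 p2,
                     float u0, float v0, float u1, float v1,
                     float u2, float v2, Vec3& T, Vec3& B)
{
    Vec3  e1  = sub(p1, p0), e2 = sub(p2, p0);
    float du1 = u1 - u0, dv1 = v1 - v0;
    float du2 = u2 - u0, dv2 = v2 - v0;
    float r   = 1.0f / (du1 * dv2 - du2 * dv1);  // assumes non-degenerate UVs

    // Solve e1 = du1*T + dv1*B, e2 = du2*T + dv2*B for T and B.
    T = normalize({ r * (dv2*e1.x - dv1*e2.x),
                    r * (dv2*e1.y - dv1*e2.y),
                    r * (dv2*e1.z - dv1*e2.z) });
    B = normalize({ r * (du1*e2.x - du2*e1.x),
                    r * (du1*e2.y - du2*e1.y),
                    r * (du1*e2.z - du2*e1.z) });
}
```

How these per-triangle vectors are then averaged into per-vertex attributes (angle-weighted, area-weighted, split at UV seams or not) is yet another degree of freedom that baker and renderer must agree on.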
Major applications such as 3dsmax have suffered from this problem in past versions: the realtime display was not matched to the baker (only the offline renderer was correct). Some developers had tools for this, such as id Software back in the Doom 3 days, or Crytek (who document their tangent-space math quite well on the web). For many others, even big players, there is no public information on the tangent space used in rendering.
This mismatch of "encoder/decoder" costs money. Artists spend extra time fixing interpolation issues, adding geometry, tweaking UV layouts... to get visually acceptable results. And yet their preview (e.g. inside the modeller) might still be "off" in the end (but close enough). As a coder I might think "I know my math" yet be unaware of the different baking tools and import/export issues. As an artist I work with what I am given and am used to "working with limitations". This causes unnecessary frustration and can lead to disputes when "one" side actually knows better.
And knowing better should be no problem today. Popular baking tools, such as xNormal, allow custom tangent-space definitions. I've worked on enhancing the 3dsmax pipeline myself: the 3pointshader fixed the mismatch in old max versions simply by encoding the "correct" tangent space (synced to 3dsmax's default baker) in 3 UVW channels. That way the realtime shader was matched to the baker. Accessing UVW data is also not too hard for import/export. Furthermore, 3dsmax allows modifying the bake process through a plugin, and one could use this to apply the same UVW-channel trick, or to disable per-pixel normalization during baking (sample project with sources here)
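The decoding side of that trick can be sketched as follows. This is hypothetical illustration code in C++ (shader logic transcribed to CPU code, not the actual 3pointshader source): the key point is that the interpolated per-vertex basis is used raw, without per-pixel orthogonalization or renormalization, so the shader decodes exactly what the baker encoded.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return {v.x/len, v.y/len, v.z/len};
}

// Transform a tangent-space normal (sampled from the map, remapped to
// [-1,1]) by the interpolated vertex basis T/B/N. Deliberately no
// per-pixel normalization of T, B or N: whether the basis is
// renormalized here must match what the baker assumed, otherwise the
// encoder and decoder disagree and shading artifacts appear.
Vec3 perturbNormal(Vec3 nTex, Vec3 T, Vec3 B, Vec3 N)
{
    Vec3 world = { nTex.x*T.x + nTex.y*B.x + nTex.z*N.x,
                   nTex.x*T.y + nTex.y*B.y + nTex.z*N.y,
                   nTex.x*T.z + nTex.y*B.z + nTex.z*N.z };
    return normalize(world);  // normalize only the final result
}
```

A flat (0,0,1) map sample with an orthonormal basis simply returns the vertex normal; the interesting cases are the skewed, unnormalized interpolated bases on curved low-res geometry, where decoder conventions actually diverge.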
So please, for the sake of saving time (and money) and sparing artists billions of "my normal map looks wrong" worries: all sides, spend one day talking it through. Educate the artists about which "bakers" they can use, and educate the coders that their TS choice (all the nitty-gritty details) matters for the asset pipeline. It might not have mattered in the bump-map days or when testing simple geometry, but once you bake complex high-res onto low-res, it does!