In my experience, the key differentiators for Informatica are its integration with data cleansing tools (like First Logic) and its ability to scale across multi-server environments to increase throughput.
DecisionStream can integrate with third-party apps, but only through job control and the command-line interface, or through custom development against an API (e.g., creating a DLL to reference within a function).
On the scalable architecture point...once you have done everything you can to improve the efficiency of your ETL code and application architecture (network, SAN, database), the only option left on a single box is to add processors and RAM. With Informatica, you can add servers instead.
However, DecisionStream has some novel dimensional functionality that is effective and easy to use. Its handling of surrogate keys, SCDs, and late-arriving facts is fantastic.
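For anyone unfamiliar with what that dimensional functionality actually does, here is a minimal, generic sketch in Python (not DecisionStream's actual implementation, and the function and field names are my own): a Type 2 SCD upsert that assigns surrogate keys and expires old rows, plus a lookup that resolves a late-arriving fact to the surrogate key that was valid on the fact's date.

```python
from datetime import date

def scd2_upsert(dim, natural_key, attrs, effective, next_sk):
    """Type 2 SCD: if attrs changed, expire the current row and
    insert a new version with a fresh surrogate key.
    Returns (surrogate_key_for_this_version, next_unused_sk)."""
    for row in dim:
        if row["natural_key"] == natural_key and row["current"]:
            if row["attrs"] == attrs:
                return row["sk"], next_sk  # no change, keep current sk
            row["current"] = False        # expire the old version
            row["end_date"] = effective
            break
    dim.append({"sk": next_sk, "natural_key": natural_key, "attrs": attrs,
                "start_date": effective, "end_date": None, "current": True})
    return next_sk, next_sk + 1

def lookup_sk(dim, natural_key, fact_date):
    """Late-arriving fact: find the surrogate key whose validity
    window covers the fact's date, not just the current row."""
    for row in dim:
        if (row["natural_key"] == natural_key
                and row["start_date"] <= fact_date
                and (row["end_date"] is None or fact_date < row["end_date"])):
            return row["sk"]
    return None  # natural key unknown on that date

# Example: a customer moves, then a fact arrives late for the old period
dim = []
sk1, nxt = scd2_upsert(dim, "C1", {"city": "NY"}, date(2020, 1, 1), 1)
sk2, nxt = scd2_upsert(dim, "C1", {"city": "LA"}, date(2021, 6, 1), nxt)
old_sk = lookup_sk(dim, "C1", date(2020, 5, 1))  # resolves to the NY version
```

The point of a tool doing this for you is that the expire-and-insert bookkeeping and the date-ranged lookup are easy to get subtly wrong by hand in SQL.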
The key problem with both is that it takes a certain kind of person to want to dig into them and really learn how to use them. There are so few quality developers out there...anybody here an expert-level DS developer?