The point is using or not using the tools *available*. I am business, not IT. As in, I am a report designer slash analyst with access to packages published by IT. If my org's IT only publishes certain data in a relational package rather than a dimensional one, there's nothing I can do about it except make do with what I've got. And thus, from the limited resources available to me, I need to choose the best tool for the job.
/Edit: Also, what makes you think an OLAP cube would have performed any better in this particular task? The report involves a dynamic calculation at the row level of the database; since both user inputs via prompts and row-level facts feed into it, it kills any performance advantage a pre-calculated and pre-aggregated cube would have.
This might be a 'point' for you, but to me it sounds like a limitation, and that being the case, your judgement of tools is limited by your exposure and hence likely to be wrong. I have worked for companies where the reports had a lot of queries with joins and unions, taking 4-5 minutes to run and in some cases just timing out. I rewrote those reports with far fewer queries, and they ran in under 30 seconds. In all those cases, people were blaming Report Studio until they got the new report. Applying a calculation to each row of data does not cause any performance issues, and you can test it yourself by making a simple report with one query and two columns (see the sketch below). You won't see any performance problem. The problem arises from too many queries, made poorly. Another reason reports end up like that is poor design of the package: when the package supports only simple reporting, the complexity has to be embedded in the report.
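For instance, a minimal version of that test boils down to a single pass over a fact table with one computed column; the database evaluates the expression as it streams each row, so there is no cost beyond the scan itself. The table and column names here are hypothetical, just to illustrate the shape of the query:

```sql
-- A minimal row-level calculation test: one query, two columns.
-- 'sales_fact', 'order_id', 'quantity', and 'unit_price' are hypothetical names.
SELECT
    order_id,
    quantity * unit_price AS extended_price  -- evaluated per row during the scan
FROM sales_fact
```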
I do understand your problem, that you have to live with what you have been given, but for the sake of enlightenment, there are tools that do exactly what you need. TM1 is designed to take inputs from the user (what-if analysis). If you do not have licenses for that, you can use Dynamic Cubes, which can also do this. Cubes do have pre-aggregated measures, but they also allow calculations on the fly over the entire data set.
Your issue is a very standard one, dealt with by many people many times, but you think it is a huge challenge simply because of limited exposure. I will give you an example: I have a package which allows users to select a currency of their choice, and it converts the value of all revenue metrics to that currency, meaning it applies the calculation to all of them, not just one, at the row level! We haven't seen any degradation in performance since implementing this, and we have operational reports as well as summarized reports that allow drill-through to detailed data.
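Conceptually, the query behind that kind of report looks something like the sketch below. All names are illustrative, and the `:target_currency` parameter stands in for whatever prompt mechanism your tool uses to capture the user's choice; the point is simply that the conversion is a join plus a multiplication on every row:

```sql
-- Row-level currency conversion, sketched with hypothetical names.
-- :target_currency stands in for the report prompt the user answers.
SELECT
    f.order_id,
    f.revenue  * r.rate AS revenue_converted,   -- applied to every row
    f.discount * r.rate AS discount_converted   -- same conversion on each metric
FROM revenue_fact f
JOIN exchange_rate r
  ON r.from_currency = f.currency
 AND r.to_currency   = :target_currency
```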
By the way, your problem is not just yours. A lot of other companies have the same problem, and that's exactly why Gartner has suggested that the BI team should work for the business instead of IT.

It's the 'non-technical' issues that cause the problem.