Recent posts

#2
Reporting / Re: Can author influence seque...
Last post by dougp - 15 Dec 2025 10:20:30 AM
Quote: I usually control the filters by writing my own SQL. Inner query and outer query, that way you can apply filters to inner query before applying to outer query
That's always a good double-check.  So if you write custom SQL for this, you get different results than what Cognos is producing?  Cognos is just generating SQL for you based on what you have created in Reporting combined with the way the model was defined.  So what's different between the SQL that Cognos is generating and the SQL that you wrote that produced the result you wanted?  Did you make a mistake in the report?  Is there an incorrect relationship in the model?  ...?


Your original question was about the order of filters.  Now it's about the order of subqueries.

That can be done by chaining multiple queries together.  Create Query1.  Use Query1 as the source for Query2.  etc.  If you're using CQM Cognos will write an old-style query using subqueries in the FROM clause.  If you are using DQM, Cognos will write exactly the same query, but using CTEs.

While one of the following queries may return faster than the others, all three return exactly the same results.

with
Query1 as (
select col1, col2
from MySource
where col1 = 1
)
select col1, col2
from Query1
where col2 = 2
;

with
Query1 as (
select col1, col2
from MySource
where col2 = 2
)
select col1, col2
from Query1
where col1 = 1
;

select col1, col2
from MySource
where col1 = 1
  and col2 = 2
;
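The equivalence of the three query shapes above is easy to check empirically. This is a minimal sketch using Python's sqlite3 as a stand-in SQL engine, with made-up sample rows for MySource; it is not the SQL Cognos generates, just a demonstration that moving a filter between the inner and outer query does not change the result set.

```python
import sqlite3

# Tiny stand-in for MySource so the three query shapes can be compared.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE MySource (col1 INTEGER, col2 INTEGER)")
conn.executemany(
    "INSERT INTO MySource VALUES (?, ?)",
    [(1, 2), (1, 3), (2, 2), (1, 2), (9, 9)],
)

queries = [
    # Filter col1 inside the CTE, col2 outside.
    """WITH Query1 AS (SELECT col1, col2 FROM MySource WHERE col1 = 1)
       SELECT col1, col2 FROM Query1 WHERE col2 = 2""",
    # Filter col2 inside the CTE, col1 outside.
    """WITH Query1 AS (SELECT col1, col2 FROM MySource WHERE col2 = 2)
       SELECT col1, col2 FROM Query1 WHERE col1 = 1""",
    # Both filters in a single WHERE clause.
    """SELECT col1, col2 FROM MySource WHERE col1 = 1 AND col2 = 2""",
]

results = [sorted(conn.execute(q).fetchall()) for q in queries]
# All three filter orderings yield the same rows.
assert results[0] == results[1] == results[2]
print(results[0])  # [(1, 2), (1, 2)]
```

The database's optimizer is free to push predicates into or out of subqueries anyway, which is why only the combined predicate set, not its placement, determines the result.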
#3
Reporting / Re: Can author influence seque...
Last post by cognostechie - 12 Dec 2025 12:53:04 PM
Quote from: FerdH4 on 10 Dec 2025 02:58:29 PM: I'm working with v11.2.3 and getting unexpected results (result set) in a query with nine Detail Filters.

I've successfully tested each filter alone and get the correct result set each time.  Even when 6, 7, or even 8 Filters are active, the result sets are correct.

But I get the wrong result set when all nine Filters are active. Records which should not be excluded are being excluded, even though they were not excluded in earlier tests with fewer than nine Filters active.

I am using Minimum and Maximum expressions on separate fields in separate Filters. Quite literally, it looks like if I could control the order of execution of one or more specific Filters, the results might be correct.

Is there anything that I can do inside of a single Query to influence the execution order of my Filters?

I usually control the filters by writing my own SQL. Inner query and outer query, that way you can apply filters to inner query before applying to outer query
#4
AI Agents and Automation / Cognos AI Agents - Has anyone ...
Last post by DaBaker - 12 Dec 2025 11:47:17 AM
AI agents automate content validation, track unused reports, monitor performance, and even assist with migration projects.

By offloading routine work to AI-driven processes, BI professionals can spend more time on strategic analysis, data storytelling, and partnering with business stakeholders. Automation is not about replacing BI teams. It is about elevating them to higher-value work.

Has anyone used the AI Agents built for Cognos?  How has it gone?
#5
AI Readiness for BI Teams / Anyone using AI within Cognos ...
Last post by DaBaker - 12 Dec 2025 11:45:13 AM
AI adoption does not start with models. It starts with the foundation of your BI environment. Teams preparing for AI should focus on improving data quality, tightening governance, cleaning redundant objects, and upgrading to Cognos 12 to ensure performance and security.

An AI-ready environment has clear data lineage, consistent metadata, modern security, and scalable infrastructure. These steps reduce the risk of misleading predictions, increase trust, and make migration to AI-enhanced tools like Watsonx BI far smoother.

Does anyone have real examples of using AI with Cognos that aren't Watsonx BI or what is built inside Cognos?
#6
Real Examples of Predictive Analytics Improving BI Workflows

Predictive analytics can reduce the manual work Cognos teams face daily. For example, forecasting sales no longer requires exporting data to spreadsheets and building models by hand. Watsonx BI automates this and provides confidence scoring so teams know how reliable each prediction is.

Another example is anomaly detection. Instead of waiting for a user to notice something odd in a dashboard, Watsonx BI can alert the team to spikes, drops, or unusual patterns before they impact decisions. These types of predictive enhancements help BI teams evolve into strategic partners for the business rather than report builders.
#7
Reporting / Re: lead and lag
Last post by dougp - 12 Dec 2025 11:04:08 AM
Today I noticed that Cognos Analytics 12.1.1 includes "Enhanced support of the LAG and LEAD functions" ( https://www.ibm.com/docs/en/cognos-analytics/12.1.x?topic=components-enhanced-support-lag-lead-functions ).  But what do they mean by "Enhanced"?  Does that mean it's already there?

I see in 12.1.1, on the Functions tab in the FM expression editor, lag and lead are included under Summaries right after count.  But in 12.0.4, I don't see it listed in either Reporting or in FM.  But...

I created a simple query against linear reference data and included

lag([Item to Report], 1, null for [Grouping Item] order by [Sorting Item])
...and it works.
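For readers on relational models, the Cognos expression above corresponds roughly to the standard SQL LAG window function, where "null for [Grouping Item]" maps to a NULL default within a PARTITION BY. This is a sketch with made-up table and column names, using Python's sqlite3 (SQLite 3.25+ supports window functions):

```python
import sqlite3

# Hypothetical data mirroring the shape of
# lag([Item to Report], 1, null for [Grouping Item] order by [Sorting Item]):
# grp = grouping item, ord = sorting item, val = item to report.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (grp TEXT, ord INTEGER, val INTEGER)")
conn.executemany("INSERT INTO t VALUES (?, ?, ?)", [
    ("a", 1, 10), ("a", 2, 20), ("a", 3, 30),
    ("b", 1, 100), ("b", 2, 200),
])

rows = conn.execute("""
    SELECT grp, ord, val,
           LAG(val, 1) OVER (PARTITION BY grp ORDER BY ord) AS prev_val
    FROM t
    ORDER BY grp, ord
""").fetchall()

for r in rows:
    print(r)
# The first row of each partition has no predecessor, so prev_val is None:
# ('a', 1, 10, None), ('a', 2, 20, 10), ('a', 3, 30, 20),
# ('b', 1, 100, None), ('b', 2, 200, 100)
```

The two-argument form of LAG defaults to NULL when there is no prior row, matching the "null" default in the Cognos expression.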

The Information box contains this for lag:

Quote: lag ( member , index_expression )
Returns the sibling member that is "index_expression" number of positions prior to "member".

Example: lag ( [Tents] , 1 )
Result: Cooking Gear

Example: lag ( [Tents] , -2 )
Result: Packs

I'm guessing that description is appropriate when using dimensional models.  I use only relational models.

Back to the 12.1.1 docs...  Take a look at the images.  You'll need to zoom your browser window a bunch because the image is horrible.  (Even after zooming, I couldn't get an OCR to convert it to text.)  But IBM provided a really helpful example for this function.  That's a vast improvement over examples for other functions, like count...

Quote: count ( [ all | distinct ] expression [ auto ] )
count ( [ all | distinct ] expression for [ all|any ] expression { , expression } )
count ( [ all | distinct ] expression for report )
Returns the number of selected data items excluding null values. Distinct is an alternative expression that is compatible with earlier versions of the product. All is supported in DQM mode only and it avoids the presumption of double counting a data item of a dimension table.

Example: count ( Sales )
Result: Returns the total number of entries under Sales.

...where they provide the documentation but the example provides zero insight into how the options affect the output.
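For what it's worth, the distinct option is easy to show with generic SQL; this sketch uses Python's sqlite3 with made-up Sales values, including duplicates and a NULL. (Cognos's DQM-only "all" option has no direct sqlite analogue, so it is omitted here.)

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Orders (Sales INTEGER)")
# Duplicate and NULL values show what each variant actually counts.
conn.executemany("INSERT INTO Orders VALUES (?)",
                 [(100,), (100,), (250,), (None,), (250,)])

plain = conn.execute("SELECT count(Sales) FROM Orders").fetchone()[0]
dist  = conn.execute("SELECT count(DISTINCT Sales) FROM Orders").fetchone()[0]
star  = conn.execute("SELECT count(*) FROM Orders").fetchone()[0]

print(plain, dist, star)  # 4 2 5
# count(Sales) skips NULLs, count(DISTINCT Sales) also collapses
# duplicates, and count(*) counts every row including the NULL one.
```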

I'm already using 12.0.4, but for the benefit of others:  Does anyone here know what version introduced lag and lead?

#8
Dashboards / Re: Dashboard performs less on...
Last post by bus_pass_man - 11 Dec 2025 05:59:17 PM
As I mentioned earlier in this thread, you deal with many to many relationships with a bridge table.   As I also mentioned, I don't think this is a legitimate bridge table situation, which was my attempt to not be my usual blunt self but rather gently suggest there are serious modelling problems.   
#9
Dashboards / Re: Dashboard performs less on...
Last post by dougp - 11 Dec 2025 01:22:14 PM
This could be old information, but setting cardinality in Cognos simply tells Cognos where to expect the measures.  N:N says there are measures on both sides of the join.

N:N seems wrong for two reasons:
It seems unlikely that a Date table would have measures.
It seems unlikely that one Fact row would be associated with many Date rows.

But the cardinality can't be causing your data explosion.  It seems like you've taken a table with 130,000 rows and joined it to a table with about 1300 rows on 1=1 rather than on something like Fact.OrderDate = DateTable.FullDate.
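The row explosion from a 1=1 join is simple to demonstrate at a smaller scale. This sketch uses Python's sqlite3 with made-up Fact and DateTable contents (4 and 3 rows standing in for 130,000 and 1,300):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Fact (OrderDate TEXT, Amount INTEGER)")
conn.execute("CREATE TABLE DateTable (FullDate TEXT)")
# Scaled-down stand-ins for the tables described in the post.
conn.executemany("INSERT INTO Fact VALUES (?, ?)", [
    ("2025-01-01", 10), ("2025-01-01", 20),
    ("2025-01-02", 30), ("2025-01-03", 40),
])
conn.executemany("INSERT INTO DateTable VALUES (?)",
                 [("2025-01-01",), ("2025-01-02",), ("2025-01-03",)])

# Joining on 1=1 is a cartesian product: rows(Fact) * rows(DateTable).
cross = conn.execute(
    "SELECT count(*) FROM Fact JOIN DateTable ON 1=1").fetchone()[0]
# A proper key join keeps one output row per fact row here.
keyed = conn.execute(
    "SELECT count(*) FROM Fact JOIN DateTable "
    "ON Fact.OrderDate = DateTable.FullDate").fetchone()[0]

print(cross, keyed)  # 12 4
# At the scale in the post, 130,000 * 1,300 is ~169 million rows.
```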

Also, if your data module joins the tables
Quote: in the exact same way
as in the dataset, you would get exactly the same results.

Next time you ask for more information about this, please post your data model clearly showing the relationships between all elements and highlight or describe the relationship that you think is causing the problem.
#10
Dashboards / Re: Dashboard performs less on...
Last post by moos_93 - 11 Dec 2025 01:14:46 AM
Quote from: dougp on 09 Dec 2025 09:54:38 AM: After rereading this, I think this is the symptom to focus on.

By any chance, does your date dimension have about 1300 rows?  About 3.6 years?  Of course, working with non-rounded numbers to begin with would help.

I think what you are meaning by N:N is not even that the date table and the fact have a many-to-many relationship.  I think you're saying they have no relationship -- a cross join or cartesian join.  So the result is a dataset with the number of rows being the number of rows from the fact times the number of rows from the date dimension.

Creating a proper relationship between the tables will help.  The result should be the number of rows on the fact.  Once you have that, you can start trying to use the dataset to answer questions.

Hey, thanks for your help! I have modelled the tables in a data module, using an N:N relationship between FACT_Trajectories and DIM_Dates. Simply plotting FACT_Trajectories on a graph works smoothly. However, when combining it with DIM_Dates, I get a time-out error.

I do not actually combine the two tables into one single table, if that is what you mean. However, to plot the data with the two tables combined, I suspect something similar to what a combined table would look like is created in the background. Or is that assumption wrong?