
Issues with setColumnIndices and getColumnCount

Hi Alex,
I have a problem.
When I set the columnIndices, the grid decreases the columnCount for some reason (the visible columns change, but the actual number of columns does not).
Is that the correct behavior?
I think the grid should have:
getColumnCount();
getColumnCount(boolean visible);
getVisibleColumnCount();

Or, if there is another way to get the total number of columns (whether visible or not), please tell me.
Thanks.
Paulo Cesar (PC from Brazil)
Monday, December 10, 2007
When you call setColumnIndices(array), the columnCount is set to the array length. If your datasource has more columns than are shown in the grid, you should obtain this information from the datasource, because the grid only 'knows' about the columns you assign with the setColumnIndices() method.
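For example, something along these lines (just a rough sketch - the data and column numbers here are made up):

var myData = [
    ["a1", "b1", "c1", "d1", "e1"],
    ["a2", "b2", "c2", "d2", "e2"]
];
var totalColumns = 5; // keep the 'real' total yourself, taken from the datasource

var obj = new AW.UI.Grid;
obj.setCellText(myData);
obj.setRowCount(myData.length);
obj.setColumnCount(totalColumns);

obj.setColumnIndices([0, 2, 4]); // show only columns 0, 2 and 4

alert(obj.getColumnCount()); // 3 - the array length, i.e. visible columns only
alert(totalColumns);         // 5 - the total, tracked outside the grid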
Alex (ActiveWidgets)
Monday, December 10, 2007
OK Alex, but shouldn't the columnIndices be initialized?
var obj = new AW.UI.Grid;
obj.setColumnCount(4);
obj.setHeaderText("test");
obj.getColumnIndices(); // length = 0 - maybe the initial value should be [0, 1, 2, 3]?
Paulo Cesar (PC from Brazil)
Monday, December 10, 2007
Yes, that would be more logical. The current convention is that if the columnIndices property is empty, the columns are assumed to be in the 'default' order, i.e. 0, 1, 2, 3...
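In the meantime you could use a small helper (not part of the grid API, just a sketch) which falls back to the default order when the property is empty:

// returns columnIndices, or 0, 1, 2, ... when the property is empty
function getEffectiveColumnIndices(grid) {
    var indices = grid.getColumnIndices();
    if (!indices || !indices.length) {
        indices = [];
        for (var i = 0; i < grid.getColumnCount(); i++) {
            indices[i] = i;
        }
    }
    return indices;
}

getEffectiveColumnIndices(obj); // [0, 1, 2, 3] for the 4-column example above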
Alex (ActiveWidgets)
Monday, December 10, 2007

This topic is archived.