BW on HANA - Inventory Cube Optimized Cube issue
Hi All,
We have an inventory cube which works perfectly fine and we created a HANA optimized copy of the same cube. We sourced the Initialization from the original (non-HANA) cube, loaded data, and then we loaded the rest of the data from the corresponding DSOs (one for 2LIS_03_BF and one for 2LIS_03_UM).
In order to reconcile them, I opened both of them in BEx Analyzer. The original works perfectly fine; however, the HANA-optimized one gives me an error when I enter the Fiscal Variant:
Error in BW: Time is not consistent.
Error reading the data of InfoProvider
Error while reading data; navigation is possible
I checked in RSDV and the validity dates are fine. When I check in RSRV, the tests "Test the partitioning column for InfoCube" and "Consistency of time dimension for InfoCube" both return errors. If I try to repair the issues, I get the same error message:
An exception with the type CX_SY_DYNAMIC_OSQL_SEMA
Message no. RSRV000
Any clues?
Hi Busy waiting,
Please see if the following notes help you solve the issue:
1999013 - RSRV - Initial key Figure Units in Fact Tables test results in exception
1953650 - RSRV - Fact tables and dimension tables of InfoCube results in exception error
BR
Similar Messages
-
Cube Refresh Performance Issue
We are facing a strange performance issue related to cube refresh. The cube, which used to take 1 hr to refresh, is now taking around 3.5 to 4 hr without any change in the environment. Also, the data it processes is almost the same as before. Only this cube, out of all the cubes in the workspace, has suffered this performance degradation over time.
Details of the cube:
This cube has 7 dimensions and 11 measures (a mix of SUM and AVG as aggregation operators). No compression. The cube is partitioned (48 partitions). The main source of the data is a materialized view that is partitioned in the same way as the cube.
Data volume: 2,480,261 records in the source to be processed daily (almost evenly distributed across the partitions)
Cube is refreshed with the below script
DBMS_CUBE.BUILD(<<cube_name>>,'SS',true,5,false,true,false);
Has anyone faced a similar issue? Can anyone advise on what might be the cause of the performance degradation?
Environment - Oracle Database 11g Enterprise Edition Release 11.2.0.3.0
AWM - awm11.2.0.2.0
Take a look at the DBMS_CUBE.BUILD documentation at http://download.oracle.com/docs/cd/E11882_01/appdev.112/e16760/d_cube.htm#ARPLS218 and the DBMS_CUBE_LOG documentation at http://download.oracle.com/docs/cd/E11882_01/appdev.112/e16760/d_cube_log.htm#ARPLS72789
You can also search this forum for more questions/examples about DBMS_CUBE.BUILD
David Greenfield has covered many Cube loading topics in the past on this forum.
Mapping to Relational tables
Re: Enabling materialized view for fast refresh method
DBMS CUBE BUILD
CUBE_DFLT_PARTITION_LEVEL in 11g?
Reclaiming space in OLAP 11.1.0.7
Re: During a cube build how do I use an IN list for dimension hierarchy?
-
We have two things in play: our current situation - a long-running weekly process - and a project that is in progress.
We are experiencing a long-running load from one cube to another. I am looking for suggestions on decreasing this run time. Here are the details of the situation and what we have done to date.
We are on BW 3.5.
Our current long running process:
Cube A has approximately 258,000,000 records right now. (Yes, we know it's a lot, but we aren't ready to redesign just yet; we want to exhaust all possibilities before redesigning.) It is dimensioned by Material, Plant, Sales Organization, Customer, and Calendar Week.
Cube A is loaded daily with a delta from ODSs on APO. It contains forecasts for 2007, 2008, 2009.
Weekly, Cube A is loaded into Cube B. Cube B is dimensioned by Material, Plant, Sales Organization, Customer, and Calendar Week. The data is a snapshot of forecasts for the current month plus the next 19 months.
The load from Cube A to Cube B takes 16 hours. The indexes on Cube B are deleted, Cube B is loaded from Cube A, the Cube B indexes are rebuilt, and overlapping requests are deleted. The load itself runs approximately 15 hours and is slowly increasing. Approximately 54,000,000 records get loaded from Cube A to Cube B.
Our Project:
Cube C will replace Cube B. Weekly, Cube A will load into Cube C. The grain of Cube C is higher (more summarized) than that of Cube B. Cube C is dimensioned by Material, Sales Organization, Customer, and Calendar Month. The data is a snapshot of forecasts for the current month plus the next 24 months. Also, when loading Cube C, we look up the price of each material in an ODS in the start routine, and then in the update rules calculate a forecast sales amount by multiplying the price by the forecast cases. This extra step was not thought to have much of an effect on the processing.
Because our QA environment had limitations, we moved the new cube, Cube C, into production. When we attempted to load Cube A to Cube C for 0FISCPER = 2007011 to 2009011, it failed due to temp space: ORA-01652: unable to extend temp segment by 2560 in tablespace PSAPTEMP. Our DBA suggested we chunk our data loads. That was a week ago Sunday. Since then we have done all of the following:
1. Data package size was 50,000. Selection FISCPER = 2007011 to 2008001 from Cube A to Cube C. Results: manually cancelled after 1 day and 3 hours.
2. Changed data package size to 20,000. Selection FISCPER = 2007011 to 2008001 from Cube A to Cube C. We also optimized our code to load each data package into an internal table and extract the unique materials before reading the ZMATERIAL table. Results: completed in 7h 1m 29s. So 3 months took 7 hours; we need to load 24 months, so we estimated that would take almost 60 hours - not acceptable.
3. BASIS changed the global parameter "Maximum number of dialog processes for sending data" from 3 to 5.
4. We created secondary indexes on Material for /BIC/PZMATERIAL and ZFSOPO51. Data package size was still 20,000. Selection FISCPER = 2008002 to 2008003. Results: completed in 5h 26m 6s. Still too long.
5. Loaded another 2 months and it completed successfully in a little over 5 hours.
6. We met with BASIS, the DBA, and the Network/Disk team. Our conversations with them had us investigating the disk layout; we spoke of LUNs, hot spots, and Tier 2 - all things that we thought would lead us to rearrange our disks.
7. We reviewed graphs showing I/O and CPU usage and compared them to our runs. We saw that on the selection from Cube A our I/O spiked. Once the data was all selected, I/O dropped and it appeared that the CPU was hit heavily.
8. Conversations continued, much of which was over my developer head:
a. There appears to be 9GB of unallocated memory. There's an action to split the 9GB of unallocated memory between the application (BW) and the database (Oracle).
b. Review the performance of the HBA
c. Have the Oracle system processes been disabled in BWP? If not, disable them.
9. We reviewed the index on Cube A and determined that it wasn't being used. We implemented OSS Note 561961 to never USE_FACTVIEW. That decreased the selection time on Cube A, when we tested in our QA environment, from 60 minutes to 15 minutes. However, the whole InfoPackage ended in the same time as when we hadn't known about the OSS Note. Selection FISCPER = 2007012.
My next step is to build an aggregate off of Cube A that matches the grain of Cube C and kick off a load of 1 month again - Selection FISCPER = 2007012.
I will then compare this time against the other runs. If it is improved, I'd like to run several InfoPackages at the same time, using different FISCPER selections in each.
Other conversations/suggestions we've had were about compressing Cube A.
We've also reviewed our weekly generated Service Reports. Nothing obvious jumps out at me there. I have also reviewed many of the performance presentations on SDN and see that CPU usage can be reviewed, too. At this point I don't have access to some of the transactions.
Thank you for reading, and please respond with your advice. I appreciate it.
Hi Mary,
Check whether the cube is partitioned by calendar day, and compress the cube.
You can also create aggregates for the cube.
Let us know the status for further info.
Reg
Pra -
Data load taking very long time between cube to cube
Hi
In our system, data loading between cubes using a DTP in BI 7.0 is taking a very long time;
most of the time is being consumed in the "start of extraction" step only. Can anybody help in decreasing the start-of-extraction time?
Thanks
Kiran
Kindly elaborate on your issue a little. How is the mapping between the two cubes - is it a one-to-one mapping, or is there a routine in the transformation? Is there any filter or routine in the DTP? Also, did you delete the indexes before loading data to the cube?
Regards,
Sushant -
Is there a way to change the LUT extension from .CUBE to .cube on export with this script in Photoshop for Mac?
Chris Cox wrote:
The file extensions are written by the export plugin. (which incidentally has a comment that two studios wanted all caps extensions, but I failed to write down which ones in the comments)
To change the filenames, you'd want to add something at the end of doExportLUTs() that uses the supplied path and substitutes the desired extensions, then renames the file.
Thank you
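Following Chris's suggestion, here is a minimal sketch of the rename step. The helper name `lowerCubeExtension` is hypothetical, and the exact hook point (the end of doExportLUTs(), using the path that was just exported) is an assumption based on his reply; only the extension logic itself is testable outside Photoshop.

```javascript
// Sketch: map an exported LUT filename from ".CUBE" to ".cube".
// Only this string logic runs outside Photoshop; the actual file rename
// (commented below) needs the ExtendScript File API.
function lowerCubeExtension(path) {
    // replace a trailing ".CUBE" (as written by the export plugin) with ".cube"
    return path.replace(/\.CUBE$/, ".cube");
}

// Inside doExportLUTs(path), after executeAction(...) succeeds, something like:
//   var oldFile = new File(path + ".CUBE");
//   if (oldFile.exists)
//       oldFile.rename(lowerCubeExtension(oldFile.name));
// would rename the exported file on disk.
```

The same pattern would work for the other extensions (.3DL, .CSP) if you want those lowercased as well.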
// Export Color Lookup Tables automation in JavaScript
// IN_PROGRESS - why can't ColorSync Utility open any profile with a grid of 160 or larger?
// 150 works, 160 fails -- sent samples in email to Apple on Nov 8, 2013; they are investigating
// DEFERRED - right to left filenames (Arabic) come out wrong because of appending "RGB" and file extensions
// This seems to be a bug in JavaScript's handing of strings, not sure we can solve it easily.
// It might possibly be handled by checking bidi markers in UTF8 stream and adding custom handling for appending text/extensions.
// @@@BUILDINFO@@@ ExportColorLookupTables.jsx 1.0.0.0
// BEGIN__HARVEST_EXCEPTION_ZSTRING
<javascriptresource>
<name>$$$/JavaScripts/ExportColorLookupTables/Menu=Color Lookup Tables...</name>
<menu>export</menu>
<enableinfo>true</enableinfo>
<eventid>9AA9D7D6-C209-494A-CC01-4E7D926DA642</eventid>
</javascriptresource>
// END__HARVEST_EXCEPTION_ZSTRING
#target photoshop
const appUIState = app.displayDialogs;
app.displayDialogs = DialogModes.NO; // suppress all app dialogs
app.bringToFront(); // make Photoshop the frontmost app, just in case
// on localized builds we pull the $$$/Strings from a .dat file
$.localize = true;
// from Terminology.jsx
const classApplication = app.charIDToTypeID('capp');
const classProperty = app.charIDToTypeID('Prpr');
const enumTarget = app.charIDToTypeID('Trgt');
const eventGet = app.charIDToTypeID('getd');
const eventSet = app.charIDToTypeID('setd');
const kcolorSettingsStr = app.stringIDToTypeID("colorSettings");
const kDither = app.charIDToTypeID('Dthr');
const keyTo = app.charIDToTypeID('T   '); // char IDs are 4 characters, padded with spaces
const typeNULL = app.charIDToTypeID('null');
const typeOrdinal = app.charIDToTypeID('Ordn');
const kFloatWindowStr = app.stringIDToTypeID("floatWindow");
const typePurgeItem = app.charIDToTypeID('PrgI');
const enumClipboard = app.charIDToTypeID('Clpb');
const eventPurge = app.charIDToTypeID('Prge');
const keyExportLUT = app.charIDToTypeID( "lut " );
const keyFilePath = app.charIDToTypeID( 'fpth' );
const keyDescription = app.charIDToTypeID( 'dscr' );
const keyCopyright = app.charIDToTypeID( 'Cpyr' );
const keyDataPoints = app.charIDToTypeID( 'gPts' );
const keyWriteICC = app.charIDToTypeID( 'wICC' );
const keyWrite3DL = app.charIDToTypeID( 'w3DL' );
const keyWriteCUBE = app.charIDToTypeID( 'wCUB' );
const keyWriteCSP = app.charIDToTypeID( 'wCSP' );
const kScriptOptionsKey = "9AA9D7D6-C209-494A-CC01-4E7D926DA642"; // same as eventID above
const sGridMin = 7; // these must match the slider range defined in the dialog layout
const sGridMax = 256;
const sGridDefault = 32;
// our baseline UI configuration info
var gSaveFilePath = ""; // overwritten by document path
var gDescription = ""; // overwritten by document name
var gCopyright = ""; // "Adobe Systems Inc., All Rights Reserved";
var gGridPoints = sGridDefault;
var gDoSaveICCProfile = true;
var gDoSave3DL = true;
var gDoSaveCUBE = true;
var gDoSaveCSP = true;
gScriptResult = undefined;
// start doing the work...
main();
app.displayDialogs = appUIState; // restore original dialog state
gScriptResult; // must be the last thing - this is returned as the result of the script
function readOptionsFromDescriptor( d )
{
    if (!d)
        return;
    if (d.hasKey(keyFilePath))
        gSaveFilePath = d.getString( keyFilePath ); // will be overridden by UI
    if (d.hasKey(keyDescription))
        gDescription = d.getString( keyDescription ); // will be overridden always
    if (d.hasKey(keyCopyright))
        gCopyright = d.getString( keyCopyright );
    if (d.hasKey(keyDataPoints))
    {
        var temp = d.getInteger( keyDataPoints );
        if (temp >= sGridMin && temp <= sGridMax)
            gGridPoints = temp;
    }
    if (d.hasKey(keyWriteICC))
        gDoSaveICCProfile = d.getBoolean( keyWriteICC );
    if (d.hasKey(keyWrite3DL))
        gDoSave3DL = d.getBoolean( keyWrite3DL );
    if (d.hasKey(keyWriteCUBE))
        gDoSaveCUBE = d.getBoolean( keyWriteCUBE );
    if (d.hasKey(keyWriteCSP))
        gDoSaveCSP = d.getBoolean( keyWriteCSP );
}
function createDescriptorFromOptions()
{
    var desc = new ActionDescriptor();
    desc.putString( keyFilePath, gSaveFilePath ); // will be overridden by UI
    desc.putString( keyDescription, gDescription ); // will always be overridden by document name
    desc.putString( keyCopyright, gCopyright );
    desc.putInteger( keyDataPoints, gGridPoints );
    desc.putBoolean( keyWriteICC, gDoSaveICCProfile );
    desc.putBoolean( keyWrite3DL, gDoSave3DL );
    desc.putBoolean( keyWriteCUBE, gDoSaveCUBE );
    desc.putBoolean( keyWriteCSP, gDoSaveCSP );
    return desc;
}
function doExportUI()
// DEFERRED - it might be nice to be able to run without UI
// Right now we can't, but someone could modify the script if they so desire
const sDescription = localize("$$$/AdobeScript/Export3DLUT/Description=Description:");
const sCopyright = localize("$$$/AdobeScript/Export3DLUT/Copyright=Copyright:");
const sQuality = localize("$$$/AdobeScript/Export3DLUT/Quality=Quality");
const sGridPoints = localize("$$$/AdobeScript/Export3DLUT/GridPoints=Grid Points:");
const sFormatsToSave = localize("$$$/AdobeScript/Export3DLUT/Formats=Formats");
const sOpenButton = localize("$$$/JavaScripts/psx/OK=OK");
const sCancelButton = localize("$$$/JavaScripts/psx/Cancel=Cancel");
const strTextInvalidType = localize("$$$/JavaScripts/Export3DLUT/InvalidType=Invalid numeric value. Default value inserted.");
const strTextInvalidNum = localize("$$$/JavaScripts/Export3DLUT/InvalidNum=A number between 7 and 256 is required. Closest value inserted.");
const strNoExportsSelected = localize("$$$/JavaScripts/Export3DLUT/NoExportTypesSelected=No export types were selected.");
const strExportPrompt = localize("$$$/JavaScripts/Export3DLUT/ExportColorLookup=Export Color Lookup");
const strUntitledLUT = localize("$$$/JavaScripts/Export3DLUT/UntitledLUTFilename=untitled.lut");
const sSaveICC = localize("$$$/AdobeScript/Export3DLUT/ICCProfile=ICC Profile");
// these are not localized, since they refer to file format extensions
const sSave3DL = "3DL";
const sSaveCUBE = "CUBE";
const sSaveCSP = "CSP";
// strings similar to JPEG quality
const sPoor = localize("$$$/AdobeScript/Export3DLUT/Poor=Poor");
const sLow = localize("$$$/AdobeScript/Export3DLUT/Low=Low");
const sMedium = localize("$$$/AdobeScript/Export3DLUT/Medium=Medium");
const sHigh = localize("$$$/AdobeScript/Export3DLUT/High=High");
const sMaximum = localize("$$$/AdobeScript/Export3DLUT/Maximum=Maximum");
const ui = // dialog resource object
"dialog { \
    orientation: 'row', \
    gp: Group { \
        orientation: 'column', alignment: 'fill', alignChildren: 'fill', \
        description: Group { \
            orientation: 'row', alignment: 'fill', alignChildren: 'fill', \
            st: StaticText { text:'Description:' }, \
            et: EditText { characters: 30, properties:{multiline:false}, text:'<your description here>' } \
        }, \
        copyright: Group { \
            orientation: 'row', alignment: 'fill', alignChildren: 'fill', \
            st: StaticText { text:'Copyright:' }, \
            et: EditText { characters: 30, properties:{multiline:false}, text:'<your copyright here>' } \
        }, \
        qual: Panel { \
            text: 'Quality', \
            orientation: 'column', alignment: 'fill', alignChildren: 'fill', \
            g2: Group { \
                st: StaticText { text:'Grid Points:' }, \
                et: EditText { characters:4, justify:'right' }, \
                drp: DropDownList {alignment:'right'} \
            }, \
            sl: Slider { minvalue:7, maxvalue:256, value: 32 } \
        }, \
        options: Panel { \
            text: 'Formats', \
            orientation: 'column', alignment: 'fill', alignChildren: 'left', \
            ck3DL: Checkbox { text:'3DL', value:true }, \
            ckCUBE: Checkbox { text:'CUBE', value:true }, \
            ckCSP: Checkbox { text:'CSP', value:true }, \
            ckICC: Checkbox { text:'ICC Profile', value:true } \
        } \
    }, \
    gButtons: Group { \
        orientation: 'column', alignment: 'top', alignChildren: 'fill', \
        okBtn: Button { text:'Ok', properties:{name:'ok'} }, \
        cancelBtn: Button { text:'Cancel', properties:{name:'cancel'} } \
    } \
}";
const titleStr = localize("$$$/AdobeScript/Export3DLUT/DialogTitle/ExportColorLookupTables=Export Color Lookup Tables");
var win = new Window (ui, titleStr ); // new window object with UI resource
// THEORETICALLY match our dialog background color to the host application
win.graphics.backgroundColor = win.graphics.newBrush (win.graphics.BrushType.THEME_COLOR, "appDialogBackground");
// poor, low, medium, high, max
var MenuQualityToGridPoints = [ 8, 16, 32, 64, 256 ];
function GridPointsToQualityMenuIndex( num )
{
    var menu = MenuQualityToGridPoints;
    var menuItems = menu.length;
    if (num <= menu[0])
        return 0;
    if (num >= menu[ menuItems-1 ])
        return (menuItems-1);
    for (var i = 0; i < (menuItems-1); ++i)
    {
        if ((num >= menu[i]) && (num < menu[i+1]))
            return i;
    }
    return 0; // just in case of a logic failure
}
// insert our localized strings
var drop = win.gp.qual.g2.drp; // for easier typing
drop.add('item', sPoor ); // 0
drop.add('item', sLow ); // 1
drop.add('item', sMedium ); // 2
drop.add('item', sHigh ); // 3
drop.add('item', sMaximum ); // 4
drop.selection = drop.items[2]; // Medium
win.gp.description.st.text = sDescription;
win.gp.copyright.st.text = sCopyright;
win.gp.qual.text = sQuality;
win.gp.qual.g2.st.text = sGridPoints;
win.gp.options.text = sFormatsToSave;
win.gp.options.ck3DL.text = sSave3DL;
win.gp.options.ckCUBE.text = sSaveCUBE;
win.gp.options.ckCSP.text = sSaveCSP;
win.gp.options.ckICC.text = sSaveICC;
win.gButtons.okBtn.text = sOpenButton;
win.gButtons.cancelBtn.text = sCancelButton;
// set starting parameters
win.gp.description.et.text = gDescription;
win.gp.copyright.et.text = gCopyright;
win.gp.options.ckICC.value = gDoSaveICCProfile;
win.gp.options.ck3DL.value = gDoSave3DL;
win.gp.options.ckCUBE.value = gDoSaveCUBE;
win.gp.options.ckCSP.value = gDoSaveCSP;
// global flag/hack to keep the UI pretty
var gGlobalPreventChanges = false;
with (win.gp.qual)
{
    sl.value = gGridPoints;
    g2.et.text = gGridPoints;
    drop.selection = drop.items[ GridPointsToQualityMenuIndex(gGridPoints) ];
    // global flag is ugly, but recursive change calls are uglier
    g2.et.onChange = function () { if (gGlobalPreventChanges) { return; }
        gGlobalPreventChanges = true;
        var val = Number(this.text);
        this.parent.parent.sl.value = val;
        drop.selection = drop.items[ GridPointsToQualityMenuIndex(val) ];
        gGlobalPreventChanges = false; };
    sl.onChanging = function () { if (gGlobalPreventChanges) { return; }
        gGlobalPreventChanges = true;
        var val = Math.floor(this.value);
        this.parent.g2.et.text = val;
        drop.selection = drop.items[ GridPointsToQualityMenuIndex(val) ];
        gGlobalPreventChanges = false; };
}
// DEFERRED - we should also set the value if the same menu item is selected again (reset)
// but the JSX toolkit doesn't support that
drop.onChange = function()
{
    if (gGlobalPreventChanges) { return; }
    gGlobalPreventChanges = true;
    var theSelection = this.selection.text;
    if (theSelection != null) { // only change if selection made
        var theSelectionIndex = this.selection.index;
        var newGridPoints = MenuQualityToGridPoints[ theSelectionIndex ];
        win.gp.qual.g2.et.text = newGridPoints;
        win.gp.qual.sl.value = newGridPoints;
    }
    gGlobalPreventChanges = false;
};
win.onShow = function ()
{
    this.qual.sl.size.width = 128;
    this.layout.layout(true);
};
win.gButtons.cancelBtn.onClick = function () { this.window.close(2); };
// validate inputs when the user hits OK
var gInAlert = false;
win.gButtons.okBtn.onClick = function ()
{
    if (gInAlert == true)
    {
        gInAlert = false;
        return;
    }
    var gridText = win.gp.qual.g2.et.text;
    var w = Number(gridText);
    var inputErr = false;
    if ( isNaN( w ) )
    {
        if ( DialogModes.NO != app.playbackDisplayDialogs )
        {
            gInAlert = true;
            alert( strTextInvalidType );
            gInAlert = false;
        }
        win.gp.qual.g2.et.text = sGridDefault;
        win.gp.qual.sl.value = sGridDefault;
        drop.selection = drop.items[ GridPointsToQualityMenuIndex(sGridDefault) ];
        return false;
    }
    if ( (w < sGridMin) || (w > sGridMax) )
    {
        if ( DialogModes.NO != app.playbackDisplayDialogs )
        {
            gInAlert = true;
            alert( strTextInvalidNum );
            gInAlert = false;
        }
        if ( w < sGridMin )
        {
            inputErr = true;
            drop.selection = drop.items[ GridPointsToQualityMenuIndex(sGridMin) ];
            win.gp.qual.g2.et.text = sGridMin;
            win.gp.qual.sl.value = sGridMin;
            return false;
        }
        if ( w > sGridMax )
        {
            inputErr = true;
            drop.selection = drop.items[ GridPointsToQualityMenuIndex(sGridMax) ];
            win.gp.qual.g2.et.text = sGridMax;
            win.gp.qual.sl.value = sGridMax;
            return false;
        }
    }
    if (inputErr == false)
        win.close(true);
    return;
};
win.center(); // move to center the dialog
var ret = win.show(); // dialog display
if (2 == ret)
return false; // user cancelled
// user hit OK, copy values from dialog
gDescription = win.gp.description.et.text;
gCopyright = win.gp.copyright.et.text;
gGridPoints = win.gp.qual.sl.value;
gDoSave3DL = win.gp.options.ck3DL.value;
gDoSaveCUBE = win.gp.options.ckCUBE.value;
gDoSaveCSP = win.gp.options.ckCSP.value;
gDoSaveICCProfile = win.gp.options.ckICC.value;
// if no files are going to be saved, then we have zero work to do
if ((false == gDoSaveICCProfile) && (false == gDoSave3DL) &&
    (false == gDoSaveCUBE) && (false == gDoSaveCSP) )
{
    // tell the user that no formats were selected
    alert( strNoExportsSelected );
    gScriptResult = 'cancel'; // quit, returning 'cancel' (dont localize) makes the actions palette not record our script
    return false;
}
// prompt user for directory and output base filename
// default to directory and filename of current document
var currentDocumentPath;
try
{
    // if the file has no path (not saved), then this throws
    var documentPath = app.activeDocument.fullName.fsName; // Get the OS friendly file path and name
    documentPath = documentPath.replace(/\....$/,''); // remove extension, if there is one
    documentPath = documentPath + ".lut"; // add dummy extension
    currentDocumentPath = File ( documentPath );
}
catch (e)
{
    // if there was no document path, default to user's home directory
    var defaultName = "~/" + strUntitledLUT;
    currentDocumentPath = File(defaultName);
}
var fname = currentDocumentPath.saveDlg(strExportPrompt);
if (fname == null)
return false;
gSaveFilePath = fname.fsName;
return true;
function doExportLUTs( path )
{
    const keyUsing = charIDToTypeID( 'Usng' );
    const eventExport = charIDToTypeID( 'Expr' );
    var desc = new ActionDescriptor();
    var desc2 = new ActionDescriptor();
    desc2.putString( keyFilePath, path );
    desc2.putString( keyDescription, gDescription );
    desc2.putInteger( keyDataPoints, gGridPoints );
    // assemble the full copyright string, if needed
    var copyrightAssembled = gCopyright;
    if (gCopyright != "")
    {
        var theDate = new Date();
        // getYear() returns years since 1900
        var theYear = (theDate.getYear() + 1900).toString();
        // Localization team says to just use the year
        var dateString = theYear;
        copyrightAssembled = localize("$$$/JavaScripts/Export3DLUT/Copyright=(C) Copyright ") + dateString + " " + gCopyright;
    }
    desc2.putString( keyCopyright, copyrightAssembled );
    // select output format
    desc2.putBoolean( keyWriteICC, gDoSaveICCProfile );
    desc2.putBoolean( keyWrite3DL, gDoSave3DL );
    desc2.putBoolean( keyWriteCUBE, gDoSaveCUBE );
    desc2.putBoolean( keyWriteCSP, gDoSaveCSP );
    desc.putObject( keyUsing, keyExportLUT, desc2 );
    try
    {
        var resultDesc = executeAction( eventExport, desc, DialogModes.NO );
    }
    catch (e)
    {
        if ( e.number != 8007 ) { // don't report error on user cancel
            var str = localize("$$$/JavaScripts/Export3DLUT/ExportLUTFailed=Unable to run the Export Color Lookup plugin because ");
            alert( str + e + " : " + e.line );
        }
        gScriptResult = 'cancel'; // quit, returning 'cancel' (dont localize) makes the actions palette not record our script
        return false;
    }
    return true;
}
function doRenderGrid( points )
{
    // call the grid rendering plugin to do the work
    const keyRenderGrid = charIDToTypeID( "3grd" );
    const keyDataPoints2 = charIDToTypeID( 'grdP' );
    var args = new ActionDescriptor();
    args.putInteger( keyDataPoints2, points );
    try
    {
        var result = executeAction( keyRenderGrid, args, DialogModes.NO );
    }
    catch (e)
    {
        if ( e.number != 8007 ) { // don't report error on user cancel
            var str = localize("$$$/JavaScripts/Export3DLUT/RenderGridFailed=Unable to render color grid because ");
            alert( str + e + " : " + e.line );
        }
        gScriptResult = 'cancel'; // quit, returning 'cancel' (dont localize) makes the actions palette not record our script
        return false;
    }
    return true;
}
function resizeDocumentInPixels( width, height )
{
    var myDocument = app.activeDocument;
    var originalRulerUnits = app.preferences.rulerUnits;
    app.preferences.rulerUnits = Units.PIXELS;
    myDocument.resizeCanvas( width, height, AnchorPosition.MIDDLECENTER );
    app.preferences.rulerUnits = originalRulerUnits;
}
function GetColorSettings()
{
    var desc1 = new ActionDescriptor();
    var ref1 = new ActionReference();
    ref1.putProperty( classProperty, kcolorSettingsStr );
    ref1.putEnumerated( classApplication, typeOrdinal, enumTarget );
    desc1.putReference( typeNULL, ref1 );
    var result = executeAction( eventGet, desc1, DialogModes.NO );
    var desc2 = result.getObjectValue( kcolorSettingsStr );
    return desc2;
}
function GetColorConversionDitherState()
{
    var settings = GetColorSettings();
    if (settings.hasKey(kDither))
        return settings.getBoolean( kDither );
    else
        return null;
}
function ConvertTo16Bit()
{
    const eventConvertMode = charIDToTypeID( 'CnvM' );
    const keyDepth = charIDToTypeID( 'Dpth' );
    var modeDesc16Bit = new ActionDescriptor();
    modeDesc16Bit.putInteger( keyDepth, 16 );
    var result = executeAction( eventConvertMode, modeDesc16Bit, DialogModes.NO );
}
// state = true or false
function SetColorConversionDither( state )
{
    var desc1 = new ActionDescriptor();
    var ref1 = new ActionReference();
    ref1.putProperty( classProperty, kcolorSettingsStr );
    ref1.putEnumerated( classApplication, typeOrdinal, enumTarget );
    desc1.putReference( typeNULL, ref1 );
    var desc2 = new ActionDescriptor();
    desc2.putBoolean( kDither, state );
    desc1.putObject( keyTo, kcolorSettingsStr, desc2 );
    executeAction( eventSet, desc1, DialogModes.NO );
}
function PurgeClipboard()
{
    var desc1 = new ActionDescriptor();
    desc1.putEnumerated( typeNULL, typePurgeItem, enumClipboard );
    var result = executeAction( eventPurge, desc1, DialogModes.NO );
}
// This helps us avoid resizing existing document views in tabbed document mode.
// This is new functionality, and will not work in older Photoshop versions.
function MoveDocumentToNewWindow()
{
    var desc1 = new ActionDescriptor();
    var result = executeAction( kFloatWindowStr, desc1, DialogModes.NO );
}
function main()
try
var tempDoc = null;
var tempDoc2 = null;
// do basic troubleshooting first
// make sure there is a document
if (!app.activeDocument)
gScriptResult = 'cancel'; // quit, returning 'cancel' (dont localize) makes the actions palette not record our script
return;
// check the document mode
var mode = app.activeDocument.mode;
if (mode != DocumentMode.RGB
&& mode != DocumentMode.LAB
&& mode != DocumentMode.CMYK)
var str = localize("$$$/JavaScripts/Export3DLUT/UnsupportedColorMode=Could not export Color Lookup Tables because only RGB, LAB, and CMYK color modes are supported.");
alert(str);
gScriptResult = 'cancel'; // quit, returning 'cancel' (dont localize) makes the actions palette not record our script
return;
// check the document depth, for safety
var depth = app.activeDocument.bitsPerChannel; // an object, not a number - why? I have no idea...
var bitsPerChannel = 1;
if (depth == BitsPerChannelType.EIGHT)
bitsPerChannel = 8;
else if (depth == BitsPerChannelType.SIXTEEN)
bitsPerChannel = 16;
else if (depth == BitsPerChannelType.THIRTYTWO)
bitsPerChannel = 32;
else
var str = localize("$$$/JavaScripts/Export3DLUT/UnsupportedImageDepth=Could not export Color Lookup Tables because only 8, 16, and 32 bits/channel are supported.");
alert(str);
gScriptResult = 'cancel'; // quit, returning 'cancel' (dont localize) makes the actions palette not record our script
return;
// Check layer types: background plus adjustments only
// For now, don't check each layer - a multiply solid layer still works as a color adjustment, as does selective blending
// Users will get odd results from other layer types (layer masks, pixel layers, etc.)
try
app.activeDocument.backgroundLayer.visible = true;
catch (e)
if (activeDocument.layers.length == 1)
alert( localize("$$$/JavaScripts/Export3DLUT/NoAdjustmentLayers=Could not export Color Lookup Tables because this document has no adjustment layers.") );
else
alert( localize("$$$/JavaScripts/Export3DLUT/NoBackground=Could not export Color Lookup Tables because this document has no background.") );
gScriptResult = 'cancel'; // quit, returning 'cancel' (dont localize) makes the actions palette not record our script
return;
// look for last used params via Photoshop registry, getCustomOptions will throw if none exist
try
var desc = app.getCustomOptions(kScriptOptionsKey);
readOptionsFromDescriptor( desc );
catch(e)
// it's ok if we don't have any existing options, continue with defaults
// set some values from the document
gDescription = app.activeDocument.name;
// ask the user for options, bail if they cancel at any point
if ( doExportUI() == false)
gScriptResult = 'cancel'; // quit, returning 'cancel' (dont localize) makes the actions palette not record our script
return;
// we're good to go, so save our parameters for next time
app.putCustomOptions(kScriptOptionsKey, createDescriptorFromOptions() );
// remove file extension from filePath, if there is one
gSaveFilePath = gSaveFilePath.replace(/\....$/,'');
// calculate the size of image we need
var width = gGridPoints * gGridPoints;
var height = gGridPoints;
if (mode == DocumentMode.CMYK)
height = gGridPoints*gGridPoints;
// duplicate the user document so we don't mess it up in any way
tempDoc = app.activeDocument.duplicate("temporary");
// make the temporary document active
app.activeDocument = tempDoc;
// to avoid resizing existing document views in tabbed mode
MoveDocumentToNewWindow();
// convert 8 bit documents to 16 bit/channel for improved quality of merged adjustments
if (bitsPerChannel == 8)
ConvertTo16Bit();
depth = BitsPerChannelType.SIXTEEN;
// resize the temporary canvas to our target size
resizeDocumentInPixels( width, height )
// select background layer
tempDoc.activeLayer = tempDoc.backgroundLayer;
// render lookup base grid
var worked = doRenderGrid( gGridPoints );
if (worked != true)
tempDoc.close( SaveOptions.DONOTSAVECHANGES );
return; // error should have already been shown, and there is not much we can do
// do not flatten here -- the export automatically gets flattened data
// and we may need layers for LAB->RGB conversion below
// export the chosen formats
worked = doExportLUTs( gSaveFilePath );
if (worked != true)
tempDoc.close( SaveOptions.DONOTSAVECHANGES );
return; // error should have already been shown, and there is not much we can do
// for LAB documents to export 3DLUT (which are inherently RGB), we have to do additional work
// As a bonus, this works for CMYK as well!
if ( mode != DocumentMode.RGB )
var filePath = gSaveFilePath + "RGB";
var oldDitherState = GetColorConversionDitherState();
try
SetColorConversionDither(false);
const targetProfileName = "sRGB IEC61966-2.1";
// new document temp2 in sRGB, matching depth of original
var originalRulerUnits = app.preferences.rulerUnits;
app.preferences.rulerUnits = Units.PIXELS;
tempDoc2 = app.documents.add( width, gGridPoints, 72, "temp2",
NewDocumentMode.RGB, DocumentFill.WHITE,
1.0, depth, targetProfileName );
app.preferences.rulerUnits = originalRulerUnits;
// make the new doc active
app.activeDocument.name = tempDoc2;
// to avoid resizing existing document views in tabbed mode
MoveDocumentToNewWindow();
// insert grid
worked = doRenderGrid( gGridPoints );
if (worked == true)
tempDoc2.selection.selectAll();
tempDoc2.activeLayer = tempDoc2.backgroundLayer;
tempDoc2.selection.copy();
tempDoc2.close( SaveOptions.DONOTSAVECHANGES );
tempDoc2 = null;
// make sure temp1 is active
app.activeDocument.name = tempDoc;
// resize for RGB grid
resizeDocumentInPixels( width, gGridPoints );
tempDoc.selection.selectAll();
tempDoc.paste(true);
PurgeClipboard(); // so we don't leave an odd, large item on the clipboard
tempDoc.selection.deselect();
tempDoc.flatten();
// convert temp1 to sRGB
tempDoc.convertProfile( targetProfileName, Intent.RELATIVECOLORIMETRIC, true, false );
// export the chosen formats
worked = doExportLUTs( filePath );
// at this point we still have to clean up, even if the call failed, so fall through
else
tempDoc2.close( SaveOptions.DONOTSAVECHANGES );
catch (e)
if ( e.number != 8007 ) { // don't report error on user cancel
var str = localize("$$$/JavaScripts/Export3DLUT/UnableToConvertRGB=Unable to convert image to RGB because ");
alert( str + e + " : " + e.line );
if (tempDoc2 != null) tempDoc2.close( SaveOptions.DONOTSAVECHANGES );
gScriptResult = 'cancel'; // quit, returning 'cancel' (dont localize) makes the actions palette not record our script
// always reset the dither state
SetColorConversionDither( oldDitherState );
PurgeClipboard(); // so we don't leave an odd, large item on the clipboard
} // if not RGB
// always close temp document without saving
tempDoc.close( SaveOptions.DONOTSAVECHANGES );
catch (e)
if ( e.number != 8007 ) { // don't report error on user cancel
var str = localize("$$$/JavaScripts/Export3DLUT/UnableToExport=Unable to export LUT because ");
alert( str + e + " : " + e.line );
// always close temp document without saving
if (tempDoc != null) tempDoc.close( SaveOptions.DONOTSAVECHANGES );
gScriptResult = 'cancel'; // quit, returning 'cancel' (dont localize) makes the actions palette not record our scriptHi blabla12345,
(untested and without warranty)
replace this line:
const sSaveCUBE = "CUBE";
with this:
const sSaveCUBE = "cube";
Have fun -
Data mart cube to cube copy records are not matching in target cube
Hi Experts,
Need help on the questions below for a data mart cube-to-cube copy (8M*).
It is a BW 3.5 system.
We have two financial cubes:
Cube A1 - sourced from the R/3 system (delta update), and Cube B1 - sourced from cube A1 (full update). The two cubes are connected through update rules with one-to-one mapping and no routines. Basis did a copy of the back-end R/3 system from the production to the quality server about two months ago.
Cube A1, which extracts the delta load from R/3, is loading fine. But for the second cube (extraction from cube A1) I am not getting the full volume of data, only a small fraction, although the load shows a successful status in the monitor.
We tried setting selection conditions in the InfoPackage (as was done in the previous year's loads), but it still fetches the same small volume of data.
To check whether this happens only for this particular cube, we tried other cubes sourced through the Myself system, and they also get only partial data rather than the full set.
For example: if 1,000 records are available for an employee, the system extracts only some 200 of them, seemingly at random.
Any quick reply will be more helpful. Thanks

Hi Venkat,
Did you do any selective deletions in cube A1?
First reconcile the data of cube 1 and cube 2: match the totals of cube 1 with cube 2.
Thanks,
Vijay. -
Cube to cube data transfer, and delta status
Hi Gurus
I need to do partitioning on a cube.
I have a cube ZAA_C01, which is updated by two DataSources ZDS1 and ZDS2 (delta update). I copied ZAA_C01 and created a new cube ZAA_C02, and transferred the data to it. But it shows as a single request in the cube. After partitioning I reloaded ZAA_C01, and now I see only a single request in the cube.
Will I lose the delta in this case, or am I missing a step? I did this activity in development and need to do it in production.
Regards
Shashi

Hi Shashidhar,
Partitioning splits a large cube into smaller units; from a performance point of view this is why we partition.
There are two types of partitioning:
1. Logical partitioning
2. Physical partitioning
Physical partitioning is done at the database level, and logical partitioning at the data target level (e.g. at the InfoCube level).
Cube partitioning with the time characteristics 0CALMONTH or 0FISCPER is physical partitioning (so we need at least one of these time characteristics to partition).
Logical partitioning means you partition your cube by year or month: you divide the cube into different cubes and create a MultiProvider on top of them.
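To make the idea concrete, here is a minimal sketch of logical partitioning (plain Python with illustrative names, nothing SAP-specific): records are routed into per-year "cubes", and a MultiProvider-style read touches only the partitions a query asks for.

```python
from collections import defaultdict

def route_by_year(records):
    """Split a record stream into per-year partitions (one 'cube' per year)."""
    partitions = defaultdict(list)
    for rec in records:
        partitions[rec["fiscyear"]].append(rec)
    return dict(partitions)

def multiprovider_query(partitions, years):
    """A MultiProvider-style union read over the requested partitions only."""
    return [rec for y in years for rec in partitions.get(y, [])]

records = [
    {"fiscyear": 2010, "amount": 100},
    {"fiscyear": 2011, "amount": 250},
    {"fiscyear": 2011, "amount": 50},
]
parts = route_by_year(records)
# the query touches only the 2011 partition, not the whole data set
result = multiprovider_query(parts, [2011])
```

The benefit is exactly this pruning: a query restricted to one year never has to scan the other years' data.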
Steps:
Select your cube and double-click it.
In the menus, choose Extras -> DB Performance -> Partitioning,
and specify how many partitions you want.
Your cube must contain at least 0CALMONTH or the fiscal year/period characteristic.
Prerequisites
You can only partition a dataset using one of the two partitioning criteria calendar month (0CALMONTH) or fiscal year/period (0FISCPER). At least one of the two InfoObjects must be contained in the InfoProvider.
http://help.sap.com/saphelp_nw04s/helpdata/en/8c/131e3b9f10b904e10000000a114084/frameset.htm
Thanks,
kiran. -
Cube to Cube "Error in data Selection" plz help
Hi this is Ajay Reddy
When I am sending data from cube to cube I am getting the error
"Error in data selection". Please help.

Hi,
This needs a note implementation. I got a similar problem in BI 7 at SP 11, and we got the solution from one of the notes; I cannot recall the note number.
Anyway, check whether notes 920971 and 155471 help you.
With rgds,
Anil Kumar Sharma .P -
Cube to cube datamart in 3.5
I have built a cube to cube datamart and cannot determine how to initiate the load. When building a datamart with an ODS as the source, it is quite obvious (right-click ODS and select Update ODS Data in Data Target). However, this option does not exist in the case of a cube.
It's a one-time load of some historical data. I don't require scheduling of deltas, just a single full load. I do, however, need selectability on a date field to exclude recent data.
Can someone please advise how to initiate this data load?
Many thanks,
Brett

Hi,
Go to the InfoSource tab in RSA1 and search for 8<your InfoCube technical name>.
E.g.: if your source InfoCube is 0TCT_C01, then search for 80TCT_C01.
Create an InfoPackage on the 8* technical name and schedule the load. This will load data from your source InfoCube to the target InfoCube.
If it is not clear please let me know.
Thanks
Santosh R C -
Delta update from cube to cube
How will the delta be calculated for a cube-to-cube load, as it is via the change log table in the case of an ODS?
Thanks in advance
Shrish.

Hi
Regarding Generic Extraction:
You can find the safety interval in R/3 -> RSO2 -> DataSource -> Delta. Generally, if the delta is based on the calendar day (calday) option, it is 1 day; for timestamp-based deltas it depends on the data load frequency.
this is from help.
Safety Interval Upper Limit of Delta Selection
This field is used by DataSources that determine their delta generically using a repetitively-increasing field in the extract structure.
The field contains the discrepancy between the current maximum when the delta or delta init extraction took place and the data that has actually been read.
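The effect of the upper limit can be sketched as follows (a hedged Python illustration, not the actual extractor code; the field and function names are made up): instead of selecting records up to the current clock time, the extractor selects only up to "now minus safety interval", so records committed late during extraction are picked up by the next delta rather than lost.

```python
from datetime import datetime, timedelta

def delta_selection(records, last_read, now, safety_interval):
    """Select records in the window (last_read, now - safety_interval]."""
    upper = now - safety_interval
    picked = [r for r in records if last_read < r["ts"] <= upper]
    return picked, upper  # 'upper' becomes the new last-read pointer

now = datetime(2024, 1, 2, 12, 0, 0)
records = [
    {"ts": datetime(2024, 1, 2, 11, 0), "id": 1},
    {"ts": datetime(2024, 1, 2, 11, 59), "id": 2},  # arrived just before "now"
]
picked, new_pointer = delta_selection(
    records,
    last_read=datetime(2024, 1, 1),
    now=now,
    safety_interval=timedelta(minutes=30),
)
# record 2 falls inside the safety window and is left for the next delta run
```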
If the value is left blank, there is an increased risk that the system fails to extract records that arise during the extraction. -
Cube to Cube vs ODS to Cube good for loading performance
BW Experts,
I have the option of loading from cube to cube or from ODS to cube. The source has about 20 million records.
So, from a loading performance perspective, which one is better?
Thanks in advance,
BWer

Hi,
Of course (but you didn't mention that the level of detail was different), it depends on whether there is a lot more data in the DSO than in the cube.
A DSO is one table, which makes it easier to extract data from and to build additional indexes on to speed up load time.
The InfoCube is, as you know, more complex, built on the multidimensional model.
Kind regards
/martin -
What is the relationship between the dimensions of a cube and its characteristics?
What is the relationship between the dimensions of a cube and its characteristics? I mean, is it one-to-one, one-to-many, or many-to-one?
Hi,
Ideally the characteristics in a dimension are 1:n, as already explained.
But how do you deal with characteristics that would need to be placed in a dimension with an n:m relationship?
It is always better to put unrelated characteristics with n:m relationships into different dimensions; otherwise the sizes of the dimension and fact tables would be nearly the same, which is not advisable.
As usual, it depends. If you have enough dimensions available, you normally put them into different dimensions. If you have lots of characteristics, you look at the expected number of values per characteristic, and you put those characteristics with few values, like version, value type, and currency type, into one dimension. Sometimes you also combine characteristics that are unrelated but need to be logically grouped, like cost element and debit/credit indicator.
Or your characteristics are slightly correlated (what you may call a 1.5:n relation).
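The sizing argument can be made concrete with a small sketch (plain Python, illustrative cardinalities): in the worst case, a dimension table grows toward the product of the cardinalities of the characteristics it combines, which is why only low-cardinality characteristics should share a dimension.

```python
from math import prod

def max_dimension_rows(cardinalities):
    """Worst-case dimension-table rows when the given characteristics share one dimension."""
    return prod(cardinalities)

# Small, loosely related characteristics are safe to share one dimension:
small = max_dimension_rows([3, 4, 5])        # e.g. version, value type, currency type

# Two high-cardinality characteristics in an n:m relation can blow the
# dimension table up toward fact-table size -> keep them in separate dimensions:
large = max_dimension_rows([100000, 50000])
```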
Hope this helps.
Thanks = points, as per SDN ;)
Regards,
Debjani
DTP - Load from cube to cube - Setting "Use aggregates"
Hi all,
what exactly does the setting "Use aggregates" on the extraction tab in a DTP which loads data from Cube to Cube mean? The F1-Help didn't really bring me any further...
Thanks to any answers in advance!
Kind regards,
Philipp

Hi,
You can update data from one data target to another through data marting.
Right-click the ODS or cube from which you are updating data to the other target and select "Generate Export DataSource". This creates a DataSource named 8 followed by your data target's name. Then assign this DataSource to your InfoSource, create update rules between the targets, and choose the update of data into the other data target from the source.
Select the type of update, and then load it from your InfoPackage.
Regards
MM
Inventory loads to cube 0IC_C03
Hi guys,
I am loading data to the inventory cube 0IC_C03 and have done all the steps. I have one question: initially I loaded data to the cube from the 2LIS_03_BF DataSource. I did the following steps:
• Init delta (2LIS_03_BF) with data transfer.
• Release the request with "No Marker Update" ticked.
From tomorrow we will get delta loads. My question is: do we need to compress the delta requests with "No Marker Update" ticked?
Or do I need to untick the "No Marker Update" option?
The data is very critical here. Please let me know if you have the right answer.
Regards
Santosh

Hi,
Please see the information below.
Marker update is used to reduce the time needed to fetch non-cumulative key figures during reporting. It makes it easy to obtain the values of previous stock quantities in reports. The marker is a point in time which marks an opening stock balance; data up to the marker is compressed.
The no-marker-update question arises when the target InfoCube contains a non-cumulative key figure. For example, take the material movements InfoCube 0IC_C03, where stock quantity is a non-cumulative key figure. Loading data into the cube involves two steps:
1) First, load the records pertaining to the opening stock balance, i.e. the stock present at the time of implementation. At this point, set the marker to update (uncheck "No Marker Update") so that the current stock quantity is stored in the marker. After that, when loading the historical movements (stock movements made before the time of implementing the cube), you must check "No Marker Update" so that the marker is not updated: these historical movements only led up to the opening stock quantity, which has already been loaded, so aggregating them into the marker again would double-count the present stock.
2) After every successful delta load, do not check "No Marker Update" (allow the marker to be updated) so that the changes in stock quantity are reflected in the marker value. The marker is updated only for requests that are compressed; it is not updated for uncompressed requests. Hence every delta request should be compressed.
Check or uncheck the "No Marker Update" option:
To compress a request with stock marker update => uncheck the "No Marker Update" option.
To compress loads without the stock marker update => check the "No Marker Update" option.
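The marker mechanics above can be sketched with a toy model (made-up Python, simplified numbers, not SAP internals): the marker holds the stock of all compressed requests, a report reads marker plus uncompressed deltas, and compressing a request either moves its quantity into the marker or discards it from the marker calculation.

```python
class NonCumulativeCube:
    """Toy model of the stock marker for a non-cumulative key figure."""
    def __init__(self):
        self.marker = 0          # reference point: stock of compressed requests
        self.uncompressed = []   # delta requests not yet compressed

    def load_request(self, qty):
        self.uncompressed.append(qty)

    def compress(self, update_marker=True):
        # "No Marker Update" unchecked -> update_marker=True
        # "No Marker Update" checked   -> update_marker=False
        if update_marker:
            self.marker += sum(self.uncompressed)
        self.uncompressed.clear()

    def current_stock(self):
        # a report reads the marker plus all uncompressed deltas
        return self.marker + sum(self.uncompressed)

cube = NonCumulativeCube()
cube.load_request(500)               # 1) opening stock (2LIS_03_BX)
cube.compress(update_marker=True)    #    marker now holds 500
cube.load_request(-40)               #    historical movement (2LIS_03_BF):
cube.compress(update_marker=False)   #    already reflected in the opening
                                     #    stock, so "No Marker Update" stays checked
cube.load_request(30)                # 2) daily delta load, to be compressed
                                     #    with the marker updated
```

Note that the historical movement does not change the current stock, exactly because the opening stock already contains its effect.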
Relevant FAQs:
1) The marker is not relevant when no data is transferred (e.g. during a delta init without data transfer).
2) The marker update is like a checkpoint (it gives a snapshot of the stock on the particular date when it is updated).
Reference information:
Note 643687 Compressing non-cumulative InfoCubes (BW-BCT-MM-BW)
Note 834829 Compression of BW InfoCubes without update of markers (BW-BEX-OT-DBIF-CON)
Note 745788 Non-cumulative mgmnt in BW: Verifying and correcting data (BW-BCT-MM-BW)
Note 586163 Composite Note on SAP R/3 Inventory Management in SAP BW (BW-BCT-MM-IM)
Thanks and regards -
Business Content Inventory Management (0IC_C03) Cube
I am thinking of adding a data staging ODS to our Inventory Management model. However, I wanted to check whether the delivered cube 0IC_C03 is aggregated or whether it is a representation of the R3 extraction. If the latter is the case I do not see the purpose of an additional ODS layer. Could anyone pass their thoughts? Thanks
hi Niten,
check oss note 581778-ODS capability of extractors from Inventory Management
Symptom
Data is not updated into the ODS when you load certain data records of the 2LIS_03_BF extractor.
Other terms
ODS, 2LIS_03_BF, 2LIS_03_UM, 2LIS_03_BX, 2LIS_40_S278, 0STORNO, ROCANCEL, RODMUPDMOD, 0RECORDMODE, Inventory Management, 0IC_C03,
Application 03, Logistics Extraction Cockpit, LBIW
Reason and Prerequisites
The 2LIS_03_BF extractor does not return any "connected" after and before images, as in the SD extraction, for example.
Originally, the extractor was only designed for updating into an InfoCube. The extraction method was converted from the "D" delta process to "ABR1" with PI 2000.1 patch 4, or PI-A 2000.1 patch 4 so that it could be updated into an ODS object (see notes 322303, 385703 and 323811).
If documents are canceled in Inventory Management, a new document is generated (new document and item number).
This type of operation is transferred into BW with the "ROCANCEL" field set to 'X'. This results in the necessary inversion of quantities or value key figures. A new record is generated for this in the ODS (the "preceding document" has another key).
Example (ODS key: document | item | document year):
Movement data:
Document  Item  Doc. year  Mvt type  Qty   ROCANCEL
4711      1     2002       101       +100
4712      1     2002       102       -100  X
ODS contents:
Document  Item  Doc. year  Mvt type  Qty
4711      1     2002       101       +100
4712      1     2002                 -100
The "Movement type" field was set to initial in the data record for document 4712 during the ODS update.
Document 4712 is a before image (0RECORDMODE = 'X') for the ODS, so this data does not arrive in the ODS. However, the before image (document 4712) has no link to the after image (document 4711) because the keys differ (document/item/document year). SAP R/3 Inventory Management handles these operations with new material documents and (reversal) movement types.
BW technology note no. 747037 provides more information.
Solution
You have two options to evaluate cancellations or operations which contain reversal movement types and ROCANCEL = "X" in the ODS:
1. Implement a start routine
Insert the following code into the start routine of your update
(ODS key: material doc. | item number | calendar year):
****************************** BEGIN ******************************
LOOP AT DATA_PACKAGE.
  IF DATA_PACKAGE-recordmode EQ 'X'.
    DATA_PACKAGE-recordmode = ''.
    MODIFY DATA_PACKAGE.
  ENDIF.
ENDLOOP.
******************************* END *******************************
2. Change the transfer rules
With the transfer rules, do not assign the "ROCANCEL" DataSource field to the "0RECORDMODE" InfoObject in the InfoSource. As a result, only records with "0RECORDMODE" = ' ' are transferred to the ODS. For the ODS, the after images whose characteristics or key figures that are set to overwrite, are not deleted.
Other special features when updating the extractors of the inventory management into ODS objects:
1. 2LIS_03_BF and 2LIS_03_UM
a) ODS capability
For more information, see notes 322303, 323811 and 385703.
b) Key creation
For 2LIS_03_BF, see note 417703.
The following keys are available for 2LIS_03_UM:
1. MCMSEG-BUKRS
2. MCMSEG-BELNR
3. MCMSEG-GJAHR
4. MCMSEG-BUZEI
Field no. 4 is not delivered as standard in the extract structure, but it can be added using the Logistics extract structures Customizing Cockpit.
Field no. 1 may not be included in the relevant organizational structure.
2. 2LIS_03_BX (up to and including 3.0B 2LIS_40_S278)
The dynamic stock calculation from stocks (2LIS_03_BX) and movements (2LIS_03_BF) is only possible with (non-cumulative) InfoCubes.
Using ODS technology is only useful in this context for:
- an Enterprise DataWarehouse layer
- the realization of a snapshot scenario:
See the How to paper for this topic at:
-> http://service.sap.com/bw
-> left navigation area: SAP BW InfoIndex
-> info index: N ("Non-cumulatives") ->
"How to... handle inventory management scenarios in SAP BW 3.x" (pdf)
Use note 773823 to update the 2LIS_03_BX InfoSource into an ODS.
Note also that you cannot update any key figures into an ODS object (see notes 752492 and 782314).