Ascential Datastage error
Hi All,
I am getting an error while loading data from Ascential DataStage.
There is a red error request in the PSA and a yellow status in the Request update column.
The error message in the Details tab is:
Error in Data Request
When I click on the messages, it displays:
BW Load : Critical Failure, no job name for PULL job.
One more error:
Ex=1 Error when opening an RFC connection
Please suggest a solution.
Thanks in advance!
I was getting a similar error while loading data to BW; the option I was using was the PUSH method.
First, the RFC connection should be checked: in the SAP GUI, right-click on the source system and choose "Check". The connection check should be successful; if not, the connection between Ascential and BW is not set up properly and has to be corrected, along with the error in the request.
I have searched for this, and the only answer I found for such an issue is that a compatibility patch should be installed for DataStage, so we have to install the 4.3.2 patch.
Thanks
VJ
Similar Messages
-
Load Failure from Ascential DataStage to BW
Dear All,
When I connect from Ascential DataStage to SAP BW, the load fails and in the BW I see the error message
"S:RSVAR:051 Error opening InfoSource file: FPOCLE_MASTERTEXT.TXT"
What could be the cause of the problem.
The loads started failing after we have installed SAPBW pack 4.2.1 on datastage server.
Please help me to resolve this problem.
Many Thanks for the support.
Regards,
Satya Raj Kumar.
Hi Jkyle,
Yes, this setup was running absolutely fine before ......
Since 30th I see no data in the PSA and in the monitor I see the messages:
The error messages for the 30th, 2nd, and 4th:
Under technical status:
Request REQU_AW5OAXR6OL22AO4XMNW9K7CTX (71,936) has not, or has not correctly, been updated
Under details:
Requests (messages): Errors occurred
Data request arranged
Error in the data request
Extraction (messages): Missing messages
Transfer (Idocs and TRFC): Missing messages
Processing (data packet): No data
For 1st and 3rd May, SM37 shows the job
cancelled, with the following messages in the log:
BI_BTCH_BILLING_ITEM_DELTA DIAGONL01 Cancelled 01.05.2005 04:30:04 4 4
BI_BTCH_BILLING_ITEM_DELTA DIAGONL01 Cancelled 03.05.2005 04:30:04 3 4
Job Log:
Job started
Step 001 started (program RSBATCH1, variant , user name DIAGONAL01)
Start InfoPackage ZPAK_C07V5RINLPI9WW114MO91BXCC
The last delta load is still running; no new request possible
The last delta load is still running; no new request possible
Job cancelled after system exception ERROR_MESSAGE
I checked SM59 there is no connection problem.
Please advise.
Thanks and regards,
Priya. -
Error while loading data into BW using Datastage tool
We are extracting data from external source and loading it into SAP BW using Ascential DataStage. Sometimes the job starts on the DataStage side but doesn't really run. I get the following error msg on the BW side:
S:RSVAR:051 Error opening InfoSource file: ZTESTODS_TRANSACTION.TXT
Please suggest.
Thanks,
RR
Hi,
ZTESTODS_TRANSACTION.TXT and ZTESTODS_TRANSACTION.MET are two files in the dsbwconnections directory on the datastage server. The error occurs when a process has a lock on the file.
If you use PUSH or PULL then you won't see the problem.
Best regards
Niels -
"ORA-00054 Resource Busy Error" when running SQL*Loader in Parallel
Hi all,
Please help me with an issue. We are using DataStage, which uses SQL*Loader to load data into an Oracle table. SQL*Loader invokes 8 parallel sessions to insert into the table. When doing so, we face the following error intermittently:
SQL*Loader-951: Error calling once/load initialization
ORA-00604: error occurred at recursive SQL level 1
ORA-00054: resource busy and acquire with NOWAIT specified
Since the control file is generated automatically by DataStage, we cannot modify the options and test. The control file is:
OPTIONS(DIRECT=TRUE, PARALLEL=TRUE, SKIP_INDEX_MAINTENANCE=YES)
LOAD DATA INFILE 'ora.2958.371909.fifo.1' "FIX 1358"
APPEND INTO TABLE X (
x1 POSITION(1:8) DECIMAL(15,0) NULLIF (1:8) = X'0000000000000000',
x2 POSITION(9:16) DECIMAL(15,0) NULLIF (9:16) = X'0000000000000000',
x3 POSITION(17:20) INTEGER NULLIF (17:20) = X'80000000',
IDNTFR POSITION(21:40) NULLIF (21:40) = BLANKS,
IDNTFR_DTLS POSITION(41:240) NULLIF (41:240) = BLANKS,
FROM_DATE POSITION(241:259) DATE "YYYY-MM-DD HH24:MI:SS" NULLIF (241:259) = BLANKS,
TO_DATE POSITION(260:278) DATE "YYYY-MM-DD HH24:MI:SS" NULLIF (260:278) = BLANKS,
DATA_SOURCE_LKPCD POSITION(279:283) NULLIF (279:283) = BLANKS,
EFFECTIVE_DATE POSITION(284:302) DATE "YYYY-MM-DD HH24:MI:SS" NULLIF (284:302) = BLANKS,
REMARK POSITION(303:1302) NULLIF (303:1302) = BLANKS,
OPRTNL_FLAG POSITION(1303:1303) NULLIF (1303:1303) = BLANKS,
CREATED_BY POSITION(1304:1311) DECIMAL(15,0) NULLIF (1304:1311) = X'0000000000000000',
CREATED_DATE POSITION(1312:1330) DATE "YYYY-MM-DD HH24:MI:SS" NULLIF (1312:1330) = BLANKS,
MODIFIED_BY POSITION(1331:1338) DECIMAL(15,0) NULLIF (1331:1338) = X'0000000000000000',
MODIFIED_DATE POSITION(1339:1357) DATE "YYYY-MM-DD HH24:MI:SS" NULLIF (1339:1357) = BLANKS
)
- It occurs intermittently. When this job runs, no one else is accessing the database or the tables.
- When we do not run in parallel, we do not face the error, but the load is very slow (obviously).
Just in case, I am also attaching the DataStage logs:
Item #: 466
Event ID: 1467
Timestamp: 2009-06-02 23:03:19
Type: Info
User Name: dsadm
Message: main_program: APT configuration file: /clu01/datastage/Ascential/DataStage/Configurations/default.apt
node "node1"
fastname "machine_name"
pools ""
resource disk "/clu01/datastage/Ascential/DataStage/Datasets" {pools ""}
resource scratchdisk "/clu01/datastage/Ascential/DataStage/Scratch" {pools ""}
node "node2"
fastname "machine_name"
pools ""
resource disk "/clu01/datastage/Ascential/DataStage/Datasets" {pools ""}
resource scratchdisk "/clu01/datastage/Ascential/DataStage/Scratch" {pools ""}
node "node3"
fastname "machine_name"
pools ""
resource disk "/clu01/datastage/Ascential/DataStage/Datasets" {pools ""}
resource scratchdisk "/clu01/datastage/Ascential/DataStage/Scratch" {pools ""}
node "node4"
fastname "machine_name"
pools ""
resource disk "/clu01/datastage/Ascential/DataStage/Datasets" {pools ""}
resource scratchdisk "/clu01/datastage/Ascential/DataStage/Scratch" {pools ""}
node "node5"
fastname "machine_name"
pools ""
resource disk "/clu01/datastage/Ascential/DataStage/Datasets" {pools ""}
resource scratchdisk "/clu01/datastage/Ascential/DataStage/Scratch" {pools ""}
node "node6"
fastname "machine_name"
pools ""
resource disk "/clu01/datastage/Ascential/DataStage/Datasets" {pools ""}
resource scratchdisk "/clu01/datastage/Ascential/DataStage/Scratch" {pools ""}
node "node7"
fastname "machine_name"
pools ""
resource disk "/clu01/datastage/Ascential/DataStage/Datasets" {pools ""}
resource scratchdisk "/clu01/datastage/Ascential/DataStage/Scratch" {pools ""}
node "node8"
fastname "machine_name"
pools ""
resource disk "/clu01/datastage/Ascential/DataStage/Datasets" {pools ""}
resource scratchdisk "/clu01/datastage/Ascential/DataStage/Scratch" {pools ""}
Item #: 467
Event ID: 1468
Timestamp: 2009-06-02 23:03:20
Type: Warning
User Name: dsadm
Message: main_program: Warning: the value of the PWD environment variable (/clu01/datastage/Ascential/DataStage/DSEngine) does not appear to be a synonym for the current working directory (/clu01/datastage/Ascential/DataStage/Projects/Production). The current working directory will be used, but if your ORCHESTRATE job does not start up correctly, you should set your PWD environment variable to a value that will work on all nodes of your system.
Item #: 468
Event ID: 1469
Timestamp: 2009-06-02 23:03:32
Type: Warning
User Name: dsadm
Message: Lkp_1: Input dataset 1 has a partitioning method other than entire specified; disabling memory sharing.
Item #: 469
Event ID: 1470
Timestamp: 2009-06-02 23:04:22
Type: Warning
User Name: dsadm
Message: Lkp_2: Input dataset 1 has a partitioning method other than entire specified; disabling memory sharing.
Item #: 470
Event ID: 1471
Timestamp: 2009-06-02 23:04:30
Type: Warning
User Name: dsadm
Message: Xfmer1: Input dataset 0 has a partitioning method other than entire specified; disabling memory sharing.
Item #: 471
Event ID: 1472
Timestamp: 2009-06-02 23:04:30
Type: Warning
User Name: dsadm
Message: Lkp_2: When checking operator: Operator of type "APT_LUTProcessOp": will partition despite the
preserve-partitioning flag on the data set on input port 0.
Item #: 472
Event ID: 1473
Timestamp: 2009-06-02 23:04:30
Type: Warning
User Name: dsadm
Message: SKey_1: When checking operator: A sequential operator cannot preserve the partitioning
of the parallel data set on input port 0.
Item #: 473
Event ID: 1474
Timestamp: 2009-06-02 23:04:30
Type: Warning
User Name: dsadm
Message: SKey_2: When checking operator: Operator of type "APT_GeneratorOperator": will partition despite the
preserve-partitioning flag on the data set on input port 0.
Item #: 474
Event ID: 1475
Timestamp: 2009-06-02 23:04:30
Type: Warning
User Name: dsadm
Message: buffer(1): When checking operator: Operator of type "APT_BufferOperator": will partition despite the
preserve-partitioning flag on the data set on input port 0.
Item #: 475
Event ID: 1476
Timestamp: 2009-06-02 23:04:30
Type: Info
User Name: dsadm
Message: Tgt_member: When checking operator: The -index rebuild option has been included; in order for this option to be
applicable and to work properly, the environment variable APT_ORACLE_LOAD_OPTIONS should contain the options
DIRECT and PARALLEL set to TRUE, and the option SKIP_INDEX_MAINTENANCE set to YES;
this variable has been set by the user to `OPTIONS(DIRECT=TRUE, PARALLEL=TRUE, SKIP_INDEX_MAINTENANCE=YES)'.
Item #: 476
Event ID: 1477
Timestamp: 2009-06-02 23:04:35
Type: Info
User Name: dsadm
Message: Tgt_member_idtfr: When checking operator: The -index rebuild option has been included; in order for this option to be
applicable and to work properly, the environment variable APT_ORACLE_LOAD_OPTIONS should contain the options
DIRECT and PARALLEL set to TRUE, and the option SKIP_INDEX_MAINTENANCE set to YES;
this variable has been set by the user to `OPTIONS(DIRECT=TRUE, PARALLEL=TRUE, SKIP_INDEX_MAINTENANCE=YES)'.
Item #: 477
Event ID: 1478
Timestamp: 2009-06-02 23:04:41
Type: Warning
User Name: dsadm
Message: Lkp_2,6: Ignoring duplicate entry at table record 1; no further warnings will be issued for this table
Item #: 478
Event ID: 1479
Timestamp: 2009-06-02 23:04:41
Type: Warning
User Name: dsadm
Message: Tgt_member_idtfr,0: SQL*Loader-951: Error calling once/load initialization
Item #: 479
Event ID: 1480
Timestamp: 2009-06-02 23:04:41
Type: Warning
User Name: dsadm
Message: Tgt_member_idtfr,0: ORA-00604: error occurred at recursive SQL level 1
Item #: 480
Event ID: 1481
Timestamp: 2009-06-02 23:04:41
Type: Warning
User Name: dsadm
Message: Tgt_member_idtfr,0: ORA-00054: resource busy and acquire with NOWAIT specified
Item #: 481
Event ID: 1482
Timestamp: 2009-06-02 23:04:41
Type: Warning
User Name: dsadm
Message: Tgt_member_idtfr,6: SQL*Loader-951: Error calling once/load initialization
Item #: 482
Event ID: 1483
Timestamp: 2009-06-02 23:04:41
Type: Warning
User Name: dsadm
Message: Tgt_member_idtfr,6: ORA-00604: error occurred at recursive SQL level 1
Item #: 483
Event ID: 1484
Timestamp: 2009-06-02 23:04:41
Type: Warning
User Name: dsadm
Message: Tgt_member_idtfr,6: ORA-00054: resource busy and acquire with NOWAIT specified
Item #: 484
Event ID: 1485
Timestamp: 2009-06-02 23:04:41
Type: Fatal
User Name: dsadm
Message: Tgt_member_idtfr,6: The call to sqlldr failed; the return code = 256;
please see the loader logfile: /clu01/datastage/Ascential/DataStage/Scratch/ora.23335.478434.6.log for details.
Item #: 485
Event ID: 1486
Timestamp: 2009-06-02 23:04:41
Type: Fatal
User Name: dsadm
Message: Tgt_member_idtfr,0: The call to sqlldr failed; the return code = 256;
please see the loader logfile: /clu01/datastage/Ascential/DataStage/Scratch/ora.23335.478434.0.log for details. -
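Re the ORA-00054 question above: the Info messages in the log show the Oracle stage has the "-index rebuild" option enabled, so each of the 8 parallel sqlldr sessions may try to lock and rebuild the same indexes, which is a plausible cause of the intermittent lock error. One hedged workaround, a sketch only, assuming DBA access on the target schema (the index names below are placeholders, not from the original post), is to mark the indexes unusable before the parallel direct-path load and rebuild them once afterwards, instead of letting every loader session attempt index maintenance:

```sql
-- Before the parallel direct-path load: mark each index on table X unusable
-- so the parallel sessions have nothing to contend over (names are placeholders).
ALTER INDEX x_pk_idx UNUSABLE;
ALTER INDEX x_from_date_idx UNUSABLE;

-- Allow DML against the table while its indexes are unusable.
ALTER SESSION SET skip_unusable_indexes = TRUE;

-- ... run the DataStage job / parallel sqlldr sessions here ...

-- After the load completes: rebuild each index once, in a single session.
ALTER INDEX x_pk_idx REBUILD PARALLEL 4 NOLOGGING;
ALTER INDEX x_from_date_idx REBUILD PARALLEL 4 NOLOGGING;
```

Alternatively, reducing the degree of parallelism on the Oracle stage, or removing the "-index rebuild" option so only one process performs index maintenance, may avoid the contention; test in a non-production environment first.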
Master Data loading using Ascential Data stage
Hi ,
We are loading master data using Ascential DataStage.
I have created the InfoObject ZMATNUM, and when I try to create a DataSource using a 3rd-party source system, I get the message
"DataSource ZMATNUM1(ASCENTLBI): Object type RSDS not supported in BAPI source system".
Please suggest how to make the InfoObject in the BW system ready to be loaded from DataStage.
Regards.
Snigdha
Hi,
Check the source system connectivity and try to create the DataSource again.
Regards,
Venkat -
How to migrate the Objects from 3.1c to BI7
Hi,
We are doing a functional upgrade.
How do we migrate the objects (InfoCubes, DSOs, DataSources, rules, etc.) from 3.1C to BI 7.0?
Please help me with this.
regards,
anil
BW Upgrade tasks (BW 3.1C to BI 7.0)
Prepare Phase:
Task How-To Who
Review BI 7.0 feature lists Review BI 7.0 feature lists for possible inclusion in developments. Basis/BW
Obtain the BI 7.0 upgrade guide Download the upgrade guide from http://service.sap.com/inst-guides -> SAP NetWeaver -> Upgrade
Basis/BW
Review all upgrade SAP notes In addition to the upgrade guide, check, download, and review all SAP notes for your upgrade
BI 7.0 Upgrade notes
SAP Web Application Server 6.40 upgrade notes
OS and DB specific upgrade notes
SAP BW Add-on upgrade notes
(e.g. SAP SEM, ST-PI, etc)
Plug-In upgrade SAP notes
Other notes identified in above notes and/or upgrade guides.
Basis/BW
Check DB and OS requirements for the target SAP BW release Check DB version/patch level and OS version/patch level required for upgrade
First check the most current information from the SAP BW homepage http://Service.sap.com/BW -> <SAP BW release> -> Availability
Additionally, the "Platforms" link will take you to the main DB/OS page for BI 7.0 and SAP Web AS 6.40.
Note: In some cases there are differing requirements for SAP BW 3.0B/SAP BW 3.1 Content and BI 7.0
Basis
Check SAP BW Add-on upgrade requirements Do you have SAP BW add-ons installed that require additional handling (e.g. SAP SEM, Enterprise Portal Plug-in, etc)?
SAP SEM (SAP BW based components) requires SAP SEM 4.0 which is part of the mySAP ERP 2004 suite.
WP-PI release must be at 6.00 before the upgrade begins. As mentioned before, this add-on is merged with PI_Basis after the upgrade.
Basis
Check SAP BW upgrade requirements Minimum Support Package and kernel levels for upgrade
SAP BW Frontend requirements for new SAPGUI, SAP BW BEx Frontend and SAP BW Web applications.
Source system Plug-In requirements Basis
Check compatibility requirements with 3rd party software 3rd Party Reporting tools (example: Crystal)
ETL Tools (example: Ascential, DataStage, etc)
Scheduling tools (example. Control-M, Maestro, etc)
Monitoring tools (example: HP OpenView, Patrol, etc)
Other OS or DB related tools Basis
Check new component requirements for BI 7.0 If SAP BW web reports were developed in SAP BW 2.x, a windows version of IGS 6.40 (Internet Graphics Service) is required for conversion and future rendering of web graphics (i.e. Charts and GIS Maps).
The IGS chart migration will also be required after the SAP BW web report conversion.
If you used or activated any SAP BW Web Applications in SAP BW 3.x, or if you have used charts in SAP BW 2.x web reports, you will need a windows version of IGS 6.40 (Internet Graphics Service) to execute the IGS chart migration after the upgrade.
If ESRI GIS software is in use, a different version of ESRI software may be required for BI 7.0 (ArcView 8.2?).
If you plan to use Information Broadcasting, please review the requirement for additional infrastructure components such as EP, KMC, Workbook pre-calculation service, and Web AS connectivity to your mail servers.
Detailed information is available in the SAP NetWeaver '04 master planning guide (http://service.sap.com/instguides -> SAP NetWeaver).
Basis
Test and distribute new SAP BW Frontend
Install and test the new BI 7.0 Frontend (including the new version of SAPGUI for Windows if applicable).
A detailed FAQ on the new BI 7.0 Frontend is available on the SAP service marketplace alias BWFAQ (http://service.sap.com/BWFAQ).
After successful testing, the new SAPGUI for Windows and SAP BW Frontend can be distributed to the BW teams and end users.
Basis
Alpha Conversion:
Ensure that your InfoObject data is consistent from a "conversion" perspective (Alpha Converter tool) Check that you have executed the Alpha Converter tool to check the consistency of your InfoObject definitions and data for InfoObjects that utilize the ALPHA, NUMCV and GJAHR conversion exits.
Note: The Alpha conversion is not part of the SAP BW upgrade itself, but the upgrade simply checks to ensure you have successfully executed the check tool.
Transaction RSMDCNVEXIT
Check the system status:
"All Characteristics Have Correct Internal Values": The Alpha Converter has been successfully executed. The upgrade preparation can continue.
"No Check yet/Inconsistent Internal Values exist":
The Alpha Converter check has not been executed.
"Characteristics have Inconsistent Internal Values":
The Alpha Converter tool check has been executed and data problems have been detected. The InfoObject and data must be processed before the upgrade can be started.
BW
Upgrade SAP Note updates Check for newer versions of your SAP notes for the Upgrade.
Tip: The SAP service marketplace offers an option to subscribe to OSS notes so you can be notified of changes when you log on.
Basis/BW
Confirm SAP BW support package, kernel and DB/OS configuration Analyze current Support Package and DB/OS/Kernel configurations in your SAP BW landscape in relation to the SAP BW 3.x upgrade requirements.
Apply necessary support packages, kernel patches, and DB and OS patches to meet upgrade requirements
Basis
Alignment of SAP BW objects within your SAP BW system landscape Check and, where required, re-align SAP BW Objects and developments in your SAP BW system landscape (Development, Quality Assurance and Production).
SAP BW Object differences can impact the quality of testing in the Development and Test environment and can lead to change management issues.
This check is to minimize risk and ensure productive objects are being tested prior to the Production upgrade.
Where alignment issues exist and realignment is not possible, alternative testing plans should be devised.
Basis/BW
Confirm all developments are deployed. Ensure that all SAP BW developments are deployed or they are to be re-developed/tested after the upgrade.
In the DEV system, all SAP BW development transports should be released (i.e. transport created and released) and imported to all downstream systems (i.e. QAS and PRD systems).
For SAP BW developments not already collected in the transport collector, a decision must be made:
Deploy the developments or wait until the upgrade has completed to deploy.
o Development to be deployed should be collected, released, and imported into the QAS and PRD systems.
o Developments that should be deployed after the upgrade should be re-tested/re-developed after the upgrade.
In the QAS or PRD systems, ensure that all SAP BW development transports have been imported prior to the upgrade.
BW
Implement BI 7.0 Business Explorer Frontend Install, evaluate, test and distribute the new BI 7.0 Business Explorer Frontend.
Basis/BW
Pre-upgrade Process:
Download required BI 7.0 support package Stack for inclusion in the upgrade Determine the equivalent support package level of the source SAP BW release and the target SAP BW release.
There is a minimum requirement that you upgrade to at least the equivalent support package level on the target SAP BW release so that you do not lose functionality, corrections, and data.
It is recommended to upgrade to the latest version of all support packages during the upgrade via the upgrade's support package binding functionality.
BI 7.0 Support Packages are delivered via SAP NetWeaver '04 Support Package Stacks (SP-Stacks). It is not recommended to partially apply some of the SP-Stacks' individual support packages; you should apply all of an SP-Stack's support packages at once.
For more information on the SP-Stacks and SAP NetWeaver SP-Stacks, please see the SAP service marketplace alias SP-Stacks (http://service.sap.com/sp-stacks)
You should also review, download, and bind in support packages for all add-on components that are installed on SAP BW and will be upgraded during the SAP BW upgrade (e.g. SEM-BW, ST-PI, etc)
Basis
Apply latest Support Package tool patch Apply latest SPAM patch before executing PREPARE Basis
Validate the SAP BW (ABAP) Data Dictionary and the Database Data Dictionary for consistency Check Database consistency
Transaction DB02:
o Execute ABAP SAP_UPDATE_DBDIFF and re-execute DB02 check. This gives a truer view of the SAP BW objects in DB02.
o Check missing database objects (indices, tables, etc)
o Missing indices may identify erred data loads or process problems
Tip: Missing indices on InfoCubes can be restored by RSRV or ABAP SAP_INFOCUBE_INDEXES_REPAIR
Note: check for running data loads before executing a repair!
o Check DDIC/DB consistency
Verify database objects and consistency
(e.g. SAPDBA check for offline data files)
BW
Remove unnecessary SAP BW temporary database objects Delete all SAP BW temporary database objects:
Execute routine housekeeping ABAP SAP_DROP_TMPTABLES.
This reduces the numbers of database objects that need to be copied during the upgrade.
Note: take care not to delete objects that are in use as this will cause queries, compressions, etc to terminate.
BW
Validate your SAP BW Objects for correctness prior to your upgrade Using the SAP BW Analysis Tool (transaction RSRV), perform extensive tests on all important SAP BW Objects to ensure their correctness prior to the upgrade.
Note: this test should be repeatable so you can re-validate after the upgrade!
Ensure that any inconsistencies are identified and corrected
RSRV has a number of extensive tests and if all checks are executed will consume a large amount of time. Multiple tests can be performed in parallel.
Tip: Some corrections in development can be deployed to other systems via transport in advance of the next upgrade.
BW
Ensure DB Statistics are up to date prior to the upgrade Check DB statistics for all tables.
Tables without statistics, especially system tables, can seriously impact upgrade runtimes.
Check DB statistics for missing Indexes for InfoCubes and Aggregates
o Use transaction RSRV to check BW
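If the underlying database is Oracle, the statistics and index checks above can also be sketched at the DB level with the standard DBA views (the SAPR3 schema name is a placeholder for your SAP schema; the /BIC/F* pattern follows the usual BW fact-table naming convention):

```sql
-- Tables in the SAP schema with no optimizer statistics at all.
SELECT owner, table_name, last_analyzed
FROM   dba_tables
WHERE  owner = 'SAPR3'
AND    last_analyzed IS NULL;

-- InfoCube fact-table indexes that are not in VALID state
-- (candidates for repair via RSRV or SAP_INFOCUBE_INDEXES_REPAIR).
SELECT index_name, table_name, status
FROM   dba_indexes
WHERE  owner = 'SAPR3'
AND    table_name LIKE '/BIC/F%'
AND    status <> 'VALID';
```

This is only a cross-check; RSRV remains the supported way to verify and repair BW objects.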
Check SAP BW Support Package status Check the status of all support packages (via transaction SPAM)
Ensure the Support Package queue is empty
Confirm all applied Support Packages Basis
Check all 'Repairs' Check for unreleased repair transports
Release all unreleased transports
In your QAS and PRD system, check if all repair transports have been imported (i.e. systems are aligned)
Import missing repair transports into downstream systems. This will avoid differing messages and/or errors during the upgrade.
BW
Check InfoObject status Check for revised (modified) InfoObjects that have not been activated.
All InfoObjects should be active or saved (not left in a revised state):
o Check all inactive InfoObjects:
Transaction RSD1 (Edit InfoObjects),
click on the "All InfoObjects" radio button and click the "Display" button.
Modified InfoObjects are denoted by yellow triangles!
o Determine whether the revision should be activated or removed.
'Reorg' or 'Repair' all InfoObjects
This checks and repairs any discrepancies in the InfoObject definitions and structures. It is common to have obsolete DDIC and table entries for InfoObjects after multiple upgrades and definition changes. These obsolete entries normally do not affect normal SAP BW operations.
Transaction RSD1 (Edit InfoObjects), select "Execute Repair" or "Execute Reorg". Use expert mode for selective executions.
BW
All ODS data loads must be activated. Activate all unactivated ODS Object requests.
All ODS 'M' tables must be emptied prior to the upgrade, as a new activation process is implemented.
o Unactivated ODS requests can be located via the Admin Workbench -> "Monitoring" button -> "ODS Status Overview"
BW
All Transfer and Update rules should be active Check for inactive Update and Transfer Rules
o All update rules and transfer rules should be active or deleted.
o Look into the table RSUPDINFO for update rules and search for versions "not equal" to "A". Likewise, use the table RSTS for transfer rules/structures. BW
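The table check in the row above amounts to a simple selection; in practice you would run it via SE16 in the SAP GUI, but at the database level (assuming an Oracle back end; the SAPR3 schema name and the OBJVERS version column are assumptions worth verifying in your release) it looks like this:

```sql
-- Update rules whose object version is anything other than active ('A').
SELECT *
FROM   sapr3.rsupdinfo
WHERE  objvers <> 'A';

-- Same check for transfer rules/structures.
SELECT *
FROM   sapr3.rsts
WHERE  objvers <> 'A';
```

Any rows returned correspond to inactive update or transfer rules that should be activated or deleted before the upgrade.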
All InfoCubes should be active Check for inactive InfoCubes and Aggregates (Aggregates are InfoCubes too!)
o All InfoCubes should be activated or deleted.
o Execute ABAP RSUPGRCHECK to locate any inactive InfoCubes. See SAP note 449160.
BW
All Web Report objects should be consistent prior to the upgrade. Check the consistency of your SAP BW web objects (web reports, web templates, URLs, roles, etc). All objects should be consistent prior to the web object conversion after the upgrade; it is recommended to ensure consistency before the upgrade.
o For Original release SAP BW 3.x:
A SAP BW web reporting objects check can be executed via a new check in RSRV. This is provided via a SAP BW support package.
Please see SAP note 484519 for details.
BW
Backup your system before starting PREPARE Before executing PREPARE, perform a full database backup (including the file system). Ensure you can recover to the point in time before PREPARE was executed.
Database admin
Address any instructions/errors generated by PREPARE Address any issues listed in the checks log generated by PREPARE.
o Repeat PREPARE until all checks are successful. Basis
Complete any Logistic V3 data extractions and suspend V3 collection processes Extract and empty Logistics V3 extractor queues on SAP R/3 source systems.
o The V3 extraction delta queues must be emptied prior to the upgrade to avoid any possible data loss. V3 collector jobs should be suspended for the duration of the upgrade.
They can be rescheduled after re-activation of the source systems upon completion of the upgrade.
Note: If you perform any data loads after executing PREPARE, re-check the status of all delta queues in SAP BW and the source systems(s).
BW
Complete any data mart data extractions and suspend any data mart extractors Load and Empty all Data mart Delta Queues in SAP BW. (e.g. for all export DataSources)
o The SAP BW Service SAPI, which is used for internal and 'BW to BW' data mart extraction, is upgraded during the SAP BW upgrade. Therefore, the delta queues must be emptied prior to the upgrade to avoid any possibility of data loss.
Note: If you perform any data loads after executing PREPARE, re-check the status of all delta queues in SAP BW and the source systems(s).
BW
Check that your customer defined data class definitions conform to SAP standards Check all customer created Data classes used by SAP BW Objects (i.e. InfoCubes, ODS Objects, Aggregates, InfoObjects, and PSAs) to ensure they conform to SAP standards.
o Check your data class definitions as detailed in SAP Notes 46272 and 500252.
o Incorrect data classes could create activation errors during the upgrade.
BW
Remove unnecessary SAP BW temporary database objects Delete all SAP BW temporary database objects:
Execute routine housekeeping ABAP SAP_DROP_TMPTABLES.
For more information see SAP note 308533 (2.x) and 449891 (3.x).
This reduces the numbers of database objects that need to be copied during the upgrade.
Note: take care not to delete objects that are in use as this will cause queries, compressions, etc to terminate.
Basis/BW
Backups! Before executing the upgrade, ensure that you have a backup strategy in place so you can return to the point where loading was completed and the upgrade started.
Ensuring you can return to a consistent point in time (without having to handle rollback or repeats of data loads) is key to having a successful fallback plan.
Database Admin
Before Execution:
All SAP BW administration tasks should have ceased Cease all SAP BW administration tasks such as Object maintenance, query/web template maintenance, data loads, transports, etc at the beginning of the upgrade.
The Administrators Workbench and the Data Dictionary are locked in the early phases of the upgrade.
Reminder: Users can execute queries until the time that the upgrade determines that the SAP BW System should be closed*
- timing depends on the type of upgrade selected
Basis/Admin
Remove unnecessary SAP BW temporary database objects Repeat the deletion of all SAP BW temporary database objects after you have stopped using the SAP BW Admin workbench*
Execute routine housekeeping ABAP SAP_DROP_TMPTABLES.
For more information see SAP note 308533 (2.x) and 449891 (3.x).
- timing depends on the type of upgrade selected
BW
Check system parameters Check OS, DB, and Instance profile parameters.
Check System Instance parameters for new BI 7.0 specific parameters. See SAP note 192658 for details
Check for any DB specific parameters for BI 7.0
Check for any new OS parameters Basis
Check Database archiving mode Turn database archive log mode back on if it was disabled during the upgrade!
Database Admin
After Execution:
Check the system's installation consistency Execute transaction SICK to check installation consistency
BW
Check the system logs Perform a technical systems check.
Example: Check system and all dispatcher logs (inc. ICM logs)
Basis
Apply latest executable binaries Apply the latest 6.40 Basis Kernel for all executables
Tip: use the SAP NetWeaver '04 SP-Stack selection tool to find all binaries. (http://service.sap.com/swdc)
Basis
Review BI 7.0 Support Packages for follow-up actions. Review SAP Notes for all SAP BW Support packages applied during (bound into the upgrade) and applied after the upgrade:
o Search for Notes with the keywords "BW", "SAPBWNEWS", and "<BW release>"
o Follow any required instructions identified in the SAP Notes
Basis
Apply latest patches Apply the latest SPAM patch
Apply any required support packages that were not bound into the upgrade.
Basis
Apply additional BI 7.0 Support Packages
(if required) SAP recommends that customers remain current on SAP Support Packages.
Review SAP Notes for all SAP BW Support packages applied in previous task.
o Search for Notes with the keywords "BW", "SAPBWNEWS", and "<BW release>"
o Follow any required instructions identified in the SAP Notes
Basis
Resolve any modified SAP delivered Role issues If SAP delivered Roles were modified, then these modifications may incorrectly appear in the upgrade modification adjustment tool (SPAU).
Review and implement SAP note 569128 as required Basis
Configuring Information Broadcasting EP/KMC Connections
If you plan to use the EP integration functionality of Information Broadcasting (broadcast to the EP's PCD, broadcast to KMC, or broadcast to Collaboration Rooms):
Ensure the SAP EP is at the same SP-Stack level as your SAP BW system.
Follow the online help documentation to configure and connect the SAP BW system and the SAP EP system. (http://help.sap.com).
For broadcasting to KMC, ensure that KM has the 'BEx Portfolio' content available.
Basis
Re-check SAP BW Object consistency Execute RSRV to check SAP BW Object consistency
Repeat the tests that were executed prior to the upgrade.
Validate results BW
Check InfoCube views for consistency Check consistency of InfoCube fact table views
It is possible that fact table view /BIC/V<InfoCube>F is missing if a number of SAP BW upgrades have been performed before. See SAP Note 525988 for instructions for the check and repair program.
BW
Perform SAP BW Plug-in (SAPI) upgrade follow-up tasks: if required, re-activate the SAP BW "Myself" source system in SAP BW.
The SAP BW internal plug-in (SAPI), which is used for internal data mart extraction and 'BW to BW' communication, is upgraded during the SAP BW upgrade. The source system is de-activated to prevent extractions and loading during the upgrade.
It may be required to replicate export DataSources and reactivate transfer structures/rules for internal data loads (i.e. ODS Object to InfoCube loads).
o Tip: it is advised to do this step for all export DataSources to avoid possible errors during execution of InfoPackages.
Check that all other Source Systems are active.
o Activate as required. Basis/BW
Check SAP BW Personalization is implemented (SAP BW 2.X upgrades will have performed this in the previous task).
Validate that personalization has been activated in your SAP BW system.
Note: It has been observed that in some cases, BEx Personalization has to be re-activated after an upgrade from SAP BW 3.x to BI 7.0. It is advised to check the status of personalization after the upgrade.
Enter the IMG (transaction SPRO), select SAP Business Warehouse -> Reporting relevant settings -> General Reporting Settings -> Activate Personalization in BEx
Check the status of the Personalization settings. All entries should be active - highlighted by an unchecked check box.
To activate highlighted Personalization, click Execute. BW
For SAP BW 2.0B/2.1C -> BI 7.0 Upgrades:
Convert ODS secondary indexes to the new standard:
Convert any customer-created ODS Object secondary indexes to the new ODS Object index maintenance process.
Re-create all indexes in the ODS Object definition screen in transaction RSA1.
ODS indexes must conform to the new naming convention
BW
Converting IGS chart settings
Convert your existing IGS chart settings (converts IGS chart settings from BLOB to the new XML storage format)
Ensure you have the latest SAP Web AS 6.40 IGS (stand alone windows version) installed and working
o Test via transaction RSRT
Execute the conversion process as directed in the BI 7.0 upgrade guide.
Note: This step is required for all BI 7.0 upgrades Basis
Back up your SAP BW system: perform a full database backup (including the file system).
Remember to adjust your backup scripts to include new components such as the J2EE engine, the pre-calculation service, etc.
Database Admin -
Object type RSDS not supported in BAPI source system
Hi
We have a problem when trying to create a DataSource in NW2004s with the 3rd party tool 'Ascential Datastage' as the source system. The error message is "DataSource XXX (DS-DEV): Object type RSDS not supported in BAPI source system".
Is there any note or patch available to be implemented to resolve this problem? Our support package level is 07.
Regards
D.Ellora
Message was edited by:
Ellora Dobbala

Hi Ellora,
The latest BI release in SAP NetWeaver 7.0 (2004s) includes a new type of DataSource with which the staging BAPI has not yet been aligned.
Third party ETL tool vendors can therefore only implement their load processes using the original type of DataSource from BW 3.x. This DataSource is still provided in the new release without any changes. Upgrading to NetWeaver 7.0 (2004s) BI (BI 7.0) does not therefore endanger existing implementations based on BW 3.x and NetWeaver 2004 (BW 3.5).
It is also possible to migrate third party ETL implementations based on BW 3.x into the new data-flow concept of a NetWeaver 7.0 (2004s) BI environment using the new DataSources, transformations, and data transfer processes (DTPs). This is possible because the system provides an emulated view for BW 3.x DataSources, which makes it possible to combine them with NetWeaver 7.0 (2004s) transformations and DTPs for regular batch load processing (this does not include direct access or real-time data acquisition (RDA)).
This allows the benefits of the new data loading concept to be made available in such third party ETL-based loading scenarios.
Regards,
Anil -
Maxl in unix - Essbase 7.x
I'm trying to spawn MaxL through a shell script in Unix and I'm able to do so. However, if I try to call the same shell script from an ETL tool (DataStage), I get the following error when it tries to log in to the server:
ERROR - 103 - Failed to initialize EssAPI. EssInit returned 1030018.
I have checked my .profile and it is set correctly, which explains why I'm able to call the same shell script and MaxL from Unix. This is my .profile:
LDR_CNTRL="MAXDATA=0x30000000"
export LDR_CNTRL
export ESSLANG=English_UnitedStates.Latin1@Binary
export HYPERION_HOME=/home/dsadm/Hyperion
export ARBORPATH=/home/dsadm/Hyperion/Essbase
PATH=/usr/bin:/etc:/usr/sbin:/usr/ucb:$HOME/bin:/usr/bin/X11:/sbin:/usr/java131/jre/bin:${HYPERION_HOME}:${HYPERION_HOME}/Essbase/bin
export PATH
if [ -s "$MAIL" ] # This is at Shell startup. In normal
then echo "$MAILMSG" # operation, the Shell checks
fi # periodically.
. /home/dsadm/Ascential/DataStage/DSEngine/dsenv
export LIBPATH=$ARBORPATH/bin:$LIBPATH
PS1=`echo "\n[ "`'$PWD'`echo " ]\n> "`
alias ll='ls -l'
set -o vi
export PROJ=/home/dsadm/Ascential/DataStage/Projects
export HOSTNAME=`uname -n`
export DSBIN=/home/dsadm/Ascential/DataStage/DSEngine/bin
export CGI=/home/dsadm/tomcat/webapps/ROOT/WEB-INF/cgi/ -
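A frequent cause of EssInit error 1030018 under DataStage is that the engine executes jobs with the environment from dsenv rather than the interactive .profile, so the Essbase variables may simply not be set in the job's environment. As a quick check (a sketch only; the variable names are taken from the .profile above, everything else is an assumption), something like this can be added to the script the job invokes:

```shell
# Report which of the Essbase-related variables are actually visible
# to the current process; an empty value here usually means dsenv does
# not export it, even though the interactive .profile does.
check_essbase_env() {
  missing=""
  for v in ARBORPATH HYPERION_HOME ESSLANG LIBPATH; do
    eval "val=\${$v:-}"
    if [ -z "$val" ]; then
      missing="$missing $v"
    fi
  done
  if [ -z "$missing" ]; then
    echo "OK"
  else
    echo "MISSING:$missing"
  fi
}
```

If the function prints MISSING when invoked from the DataStage job but OK from a login shell, the fix is to add the exports to DSEngine/dsenv rather than to the user's .profile.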
Oracle Legacy System to SAP Data Migration
Hi Experts,
New to data migration:
Can you guide me in how oracle staging is useful for data migration:
Here are a few of my doubts:
1. What is Oracle Staging?
2. How is Oracle staging useful for data migration?
3. I see a few ETL tools for data migration, such as Informatica and Ascential Datastage, but our requirement is: how can we use Oracle staging for data migration?
4. What are the benefits of using Oracle staging for data migration?
Expecting your response to the above queries.
Thanks,
--Kishore

1. What is Oracle Staging? It is where ODI creates temporary tables. It does the transformations and, if required, cleans the data as well.
2. How is Oracle staging useful for data migration? ODI loads source data into temporary (staging) tables, applying all the required mappings, staging filters, joins and constraints. The staging area is a separate area in the RDBMS (a user/database) where Oracle Data Integrator creates its temporary objects and executes some of the rules (mappings, joins, final filters, aggregations etc.). When performing the operations this way, Oracle Data Integrator behaves like an E-LT tool: it first extracts and loads into the temporary tables and then finishes the transformations in the target RDBMS.
3. How can we use Oracle staging for data migration (rather than ETL tools such as Informatica or Ascential Datastage)?
4. What are the benefits of using Oracle staging for data migration? You can refer to https://blogs.oracle.com/dataintegration/entry/designing_and_loading_your_own
http://docs.oracle.com/cd/E21764_01/integrate.1111/e12643/intro.htm#autoId10
--Kishore -
FTP over SSL connectivity in File Adapter
Hi All,
I request your suggestions on my problem. I have an IDoc-to-file scenario where I am connecting to my vendor's server through FTPS (FTP over SSL). The vendor specifically told us that to obtain secure FTP connectivity to their server, a pre-approved secure FTP client must be used to access the service.
So as per this requirement, our XI server first needs to connect to the pre-approved client, and the connectivity to the vendor server will happen from there. They list the pre-approved clients as below:
*Cleo Lexicom 2.1
*TrailBlazer ZMOD FTP Client V3R1 PTF Level PFT3100034
*QualEDI for Windows, 32-bit version
*Ascential DataStage TX, Release 7.5
*Future 3 - Advanced Communication Module Plus (ACM Plus)
*eBridge FTPS Communicator for GXS version 5.3
*Ipswitch Inc's WS_FTP Professional version 8.02.
*Robo-FTP version 3.2
Please let me know whether this will be possible from our file adapter. Currently, as per this requirement, we have opened up the XI server's port for FTPS connectivity, but through this we can only have a host-to-host connection over FTPS, and I am not sure whether we can connect to the client software and from there to the vendor server.
Kindly provide your suggestions/solutions on this.
Regards,
Dhill

Hi,
Thank you. Yes, I have used FTPS only; please find below the details given in the communication channel.
<b>FTP Connection Parameters</b>
Server: ServerName
Port : 6366 (specified by vendor)
Data connection : Passive
Timeout(secs) : 65
Connection Security: FTPS (FTP Using SSL/TLS) for Control and Data Connection
Command Order: AUTH TLS, USER, PASS, PBSZ, PROT
Keystore: service_ssl
X-509 Certificate and Private Key: ssl-credentials
User Name : Vendor user name
Password: Vendor given password
Connect Mode: Permanently
Transfer Mode: Text
Maximum Concurrency: 1
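Independently of the adapter, the explicit-FTPS handshake implied by the settings above can be probed with OpenSSL, since "-starttls ftp" opens a plain FTP connection and then issues AUTH TLS, matching the command order configured above. A minimal sketch (the host name is a placeholder; the port is the vendor-specified 6366):

```shell
# Build the openssl probe command for an explicit-FTPS (AUTH TLS) endpoint.
# Emitting the command as a string keeps the sketch self-contained; run the
# printed command on the XI host to see the certificate chain and handshake.
ftps_probe_cmd() {
  host="$1"
  port="$2"
  echo "openssl s_client -connect ${host}:${port} -starttls ftp"
}
```

If the handshake succeeds here but the adapter still fails, the problem is more likely the keystore/certificate setup than the network path.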
and also, as per the list given by the vendor, we can use *Ipswitch Inc's WS_FTP Professional version 8.02.
<b>Note:</b> We have deployed the SAP Java Cryptographic Toolkit, and the CA certificate used to sign the server certificate has been added to the TrustedCAs keystore view.
So if possible, I request you to kindly provide the details of how we need to specify the client software between our XI server and the vendor server, as you mentioned in your solution.
Please let me know your mail id; I will forward the screenshot of my communication channel.
I would appreciate your help on this.
Regards,
Dhill. -
How would you do this process?
Hi,
I have an interesting scenario that I am not sure how to set up correctly in Oracle. Please let me know your thoughts if possible.
We have a product (part #1 bathroom vanity = part #1 cabinet from supplier #1, part #2 granite counter from supplier #2, part #3 faucet from supplier #3, part #4 drain from supplier #3) that we drop ship from an overseas supplier direct to the port. The sales orders are all drop ship for part #1. We issue purchase orders for parts 2-4 and have them drop shipped to supplier #1 for aggregation into one shipment and box.
Questions…
How do we link recommendation drop ship orders for part 1 to prompt demand for parts 2 – 4? These would all need to be drop ship as well since they are never received here.
How do we allocate costs to the single highest level item for all costs?
Is there a way to set up a drop ship assembly?
Thanks,
CC

Hello Niten,
You can touch on a few points like:
1> Possible source systems involved: flat files, R/3, XML, legacy applications, applications like PeopleSoft etc.
2> Extractors: standard extractors for R/3, the procedure for XML extraction, tools like DB Connect for databases such as SQL Server and Oracle, tools like Ascential Datastage for applications like PeopleSoft etc.
3> Steps involved in extraction, terminology like DataSource, update modes, record modes, InfoSources, transfer rules, update rules, PSA, master data, data targets like Cubes and ODS, and the differences between them.
4> Different scenarios with examples: granularity of data, history of data to be covered, application components within R/3 that are involved and the respective approaches for the same, e.g. LO for logistics, CO-PA, FI-SL, generic extraction.
5> Features like Monitoring, Scheduling of data loads.
Hope this helps..
thanks, -
Oracle 9.2.0.1.0 to 10g Upgradation testing
Hi All,
We are upgrading our database from Oracle 9i to 10g.
Our development environment is the Ascential Datastage ETL tool.
Our project was developed by another team, and now only the database upgrade is happening.
Need some clarifications and solutions:
1) How can we get to know that all these objects were properly upgraded?
2) What can be the best testing strategy for testing all these database objects like tables, functions, procedures, views, sequences etc.? If any similar document has been prepared by any of the projects, please send it to my mail id.
3) Is there anything we need to change in any of the objects to make them compatible with 10g, like any functions that have to be replaced, or datatype mismatches etc.?
4) The type of testing that needs to be done for different database objects.

Anyone familiar with the way English is spoken in America, excluding one small enclave in North Carolina perhaps, should understand a couple of things:
1. I understand a bit of Hindi so I understood some of the background conversation while on the phone with the spammers.
2. The individuals speaking to me were as dumb as a bag of hammers.
Now according to Wikipedia the planet has about 370 million people who speak Hindi as a first language and another 580 million who speak it as a second language. 950 million people minus two seems a reasonable ratio.
Anyone looking too hard to find an insult can likely find it looking at their own reflection in the mirror. Grow up and get over it folks.
BTW: Don ... you are the one that referred people to your website claiming it contained the list that you use. Taking you at face value it appears that your list does not include doing a backup. Perhaps you are so expert in the practice that you "understand" that but if you are going to refer novices to your site you need to offer them a complete list along with the ubiquitous advertising.
Anyone wanting my list can get it for free ... here it is:
1. Read the relevant Oracle documents
2. Go to metalink and research any problems others have had with the procedure
3. Create an RDA, zip it, and have it ready to upload to metalink
4. Perform a level 0 backup using RMAN
5. Verify the backup is valid and can be used for a restore
6. Start flashback logging if not already running
7. Create a guaranteed restore point
8. Follow the directions at Step 1 except as modified by docs read in Step 2.
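Steps 4 and 5 above can be sketched as a small shell wrapper around RMAN; the tag name is an assumption, and it assumes OS authentication ("rman target /") works on the host:

```shell
# Emit the RMAN commands for the level 0 backup (step 4) and the
# restore validation pass (step 5); VALIDATE reads the backup pieces to
# confirm they are usable without actually restoring anything.
rman_level0_script() {
  cat <<'EOF'
BACKUP INCREMENTAL LEVEL 0 DATABASE TAG 'pre_10g_upgrade';
RESTORE DATABASE VALIDATE;
EOF
}

# Run it with:
#   rman_level0_script | rman target /
```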
Not one of those steps is in your list Don? And, quite frankly, I think people would be ill-advised to consider steps on what to do for "SLOW PERFORMANCE AFTER UPGRADE" any guidance on performing the upgrade itself.
But heck ... anytime you can send people to a page full of advertisements why miss the chance.
People interested in tuning would be well advised to go to one of these URLs:
http://www.jlcomp.demon.co.uk/faq/ind_faq.html#Performance_and_Tuning
http://www.hotsos.com/oop.html -
Urgent: Error in DataStage
We are extracting the data using Ascential DataStage, which is responsible for sending the data up to the PSA.
The scenario is that all the data records got loaded into the PSA, but it is still showing an error in the data request.
On detailed analysis, we found that for <b>'Data request sent off?'</b> and <b>'Processing error in warehouse reported?'</b> the status is red.
If we make the data request green manually , we are able to load the data into data targets.
Please provide some pointers towards the same.

Hi,
It's not possible to load the data in this state. I also faced the same kind of error during extraction. What I did was delete the data source, InfoSource, update rules - everything except the InfoCube in BW. After that I went to the R/3 side and transferred the data source to the BW side,
replicated it on the BW side, assigned the InfoSource, and performed the remaining steps in BW. After this it works. Check it out.
If helpful, please assign points.
regards
harikrishna N -
Peoplesoft EPM 9.1 - Error creating a Datastage project
Hi All ,
I am trying to create a new Datastage (ETL) project for PeopleSoft 9.1 in Datastage version 8.5, but keep getting this error:
Error message:
DSR.ADMIN: Error creating a schema
DataStage/SQL: "****/****" is not an SQL user.
****/**** is one of the username which we use to login to Datastage.
Any insights ?
Thanks
Dk

You need to increase the database (where the HFM schemas are created) processes and sessions. Once you increase them, you should be able to open the application without any issues.
It looks like the newly redesigned HFM v11.1.2.2 needs more DB sessions and processes.
SQL syntax error while creating a table dynamically
I am getting an error when I am trying to create a table at runtime:
Declare @FileName varchar(100)
Declare @File varchar(100)
set @FileName='brkrte_121227102828'
SET @File = SUBSTRING(@FileName,1,CHARINDEX('_',@FileName)-1)
--select @File
Declare @table_name varchar(100)
Declare @ssql varchar(1000)
SET @table_name = 'DataStaging.dbo.Staging_'+ @File
SET @sSQL = 'CREATE TABLE ' + @table_name + ' ( ' +
' [COL001] VARCHAR (4000) NOT NULL, ' +
' [Id] Int Identity(1,1), ' +
' [LoadDate] datetime default getdate() )'
Exec @sSQL
Error message:
Msg 203, Level 16, State 2, Line 16
The name 'CREATE TABLE DataStaging.dbo.Staging_brkrte ( [COL001] VARCHAR (4000) NOT NULL, [Id] Int Identity(1,1), [LoadDate]
datetime default getdate() )' is not a valid identifier.
Please help me to resolve the above error.
Thanks & regards, Vipin Jha MCP

Got the answer: the issue was with the Exec. Dynamic SQL has to be executed as Exec (@sSQL), with parentheses; without them SQL Server treats the entire string as a procedure name, which causes the "is not a valid identifier" error.
Thanks & regards, Vipin Jha MCP