Wrong table is displayed in session log (10.1.3.4.1)
Hi,
I have an interesting scenario in an Apps implementation. The scenario is as follows.
Precondition: I enabled the cache in the NQSConfig.INI file.
I get the correct results in OBIEE Answers, but when I look in the session log I find that the physical query is going to the wrong database (maybe it displayed an earlier query and the session log is not updated). I clicked Close All Cursors and cleared the cache, but the same problem still exists. Sometimes I do not get a session log at all (no log displayed), even if I set the log level to 2 or 3 in the RPD. Does anyone have some input?
I restarted the servers as well, but no use.
Regards,
Mervin
Hi Mervin, I had the same problem previously.
Somebody had earlier used log level 5 or more on that database; that is why it is still showing those queries only.
My suggestion is to clear the cache once again, increase the log level to 5 or more, and run the queries once again.
I did the same steps when I was facing the same error.
Thanks,
Sree
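For reference, the cache switch being discussed lives in the [ CACHE ] section of NQSConfig.INI. A typical OBIEE 10g fragment is sketched below; the storage path and sizes are illustrative, not recommendations. Note that the query log level itself is set per user in the repository (Manage > Security), not in NQSConfig.INI.

[ CACHE ]
ENABLE = YES ;
DATA_STORAGE_PATHS = "C:\OracleBIData\cache" 500 MB ;
MAX_ROWS_PER_CACHE_ENTRY = 100000 ;
MAX_CACHE_ENTRY_SIZE = 1 MB ;
MAX_CACHE_ENTRIES = 1000 ;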
Edited by: GSR on Oct 27, 2010 11:52 AM
Similar Messages
-
Session log from SM35-not fully displayed
Dear Friends,
My requirement is to read the log of SM35. That is, my intention is that after my report (BDC session method) is executed, the session gets processed in the background only, as I have used RSBDCSUB.
Now I want to display the log found in SM35 for the processed session. I have tried the below in my program,
i.e. after running the session I should read the log and store, for each legacy number, whether the SAP customer was created or whether it failed.
As specified in one of the posts on SDN, I have written the code below.
Get your TEMSEID and MANDANT from the APQL table, and call these four function modules in sequence.
DATA: CHARCP LIKE RSTSTYPE-CHARCO VALUE '0000'.
CALL FUNCTION 'RSTS_GET_ATTRIBUTES'
EXPORTING
AUTHORITY = ' '
CLIENT = APQL-MANDANT
NAME = APQL-TEMSEID
IMPORTING
CHARCO = CHARCP
EXCEPTIONS
FB_ERROR = 1
FB_RSTS_OTHER = 2
NO_OBJECT = 3
NO_PERMISSION = 4
OTHERS = 5.
CALL FUNCTION 'RSTS_OPEN_RLC'
EXPORTING
NAME = APQL-TEMSEID
CLIENT = APQL-MANDANT
AUTHORITY = 'BATCH'
PROM = 'I'
RECTYP = 'VNL----'
CHARCO = CHARCP
EXCEPTIONS
FB_CALL_HANDLE = 4
FB_ERROR = 8
FB_RSTS_NOCONV = 12
FB_RSTS_OTHER = 16
NO_OBJECT = 20
OTHERS = 24.
IF SY-SUBRC > 0.
EXIT.
ENDIF.
CALL FUNCTION 'RSTS_READ'
TABLES
DATATAB = LOG_TABLE
EXCEPTIONS
FB_CALL_HANDLE = 4
FB_ERROR = 8
FB_RSTS_NOCONV = 12
FB_RSTS_OTHER = 16
OTHERS = 16.
IF SY-SUBRC > 0.
EXIT.
ENDIF.
CALL FUNCTION 'RSTS_CLOSE'
EXCEPTIONS
OTHERS = 4.
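For the first step above (getting TEMSEID and MANDANT from APQL), a minimal sketch; it assumes the session name is held in APQL-GROUPID, so verify the field names in SE11 first:

DATA: WA_APQL TYPE APQL.

* 'ZCUST_LOAD' is a placeholder session name, not from this thread
SELECT SINGLE * FROM APQL INTO WA_APQL
       WHERE GROUPID = 'ZCUST_LOAD'.
IF SY-SUBRC = 0.
* WA_APQL-MANDANT and WA_APQL-TEMSEID are the values the
* RSTS_GET_ATTRIBUTES and RSTS_OPEN_RLC calls above expect
ENDIF.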
LOG_TABLE has only the following fields:
ENTERDATE LIKE BTCTLE-ENTERDATE,
ENTERTIME LIKE BTCTLE-ENTERTIME,
LOGMESSAGE(400) TYPE C,
In the log message I am not getting the exact message displayed in the SM35 log. Please suggest what I have to do.
regards
Divya

Better use:
CALL TRANSACTION tcode USING bdcdata MODE mod MESSAGES INTO itab.
and analyse SY-SUBRC and itab for every customer.
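Spelled out, that suggestion might look like the sketch below; BDCDATA is assumed to be an already-filled BDC table, and 'XD01' is just a placeholder transaction:

DATA: IT_MSG TYPE TABLE OF BDCMSGCOLL,
      WA_MSG TYPE BDCMSGCOLL,
      L_TEXT TYPE STRING.

CALL TRANSACTION 'XD01' USING BDCDATA
     MODE 'N'                          "no screen display
     UPDATE 'S'                        "synchronous update
     MESSAGES INTO IT_MSG.

* Turn each collected message into readable text
LOOP AT IT_MSG INTO WA_MSG.
  CALL FUNCTION 'FORMAT_MESSAGE'
    EXPORTING
      ID        = WA_MSG-MSGID
      NO        = WA_MSG-MSGNR
      V1        = WA_MSG-MSGV1
      V2        = WA_MSG-MSGV2
      V3        = WA_MSG-MSGV3
      V4        = WA_MSG-MSGV4
    IMPORTING
      MSG       = L_TEXT
    EXCEPTIONS
      NOT_FOUND = 1
      OTHERS    = 2.
  WRITE: / WA_MSG-MSGTYP, L_TEXT.
ENDLOOP.

With MODE 'N' nothing is shown on screen, so the collected messages in IT_MSG are the only record of what happened per customer.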
A. -
How to log the messages in an internal table and display them
Hello guys,
This is my first post in SDN.
I am uploading some data from the application server. I am doing a lot of verifications, and based on those I have to display messages.
I was asked not to use WRITE statements to display the messages as I go, but instead to collect them in an internal table and display them afterwards,
i.e. I am using a BAPI to upload; after the BAPI call I have to log the result for each record into an internal table and show it.
Can anyone guide me how to do it?

Thanks for your info Azad,
here I am posting my code. Can you go through it once and let me know whether I am on the right track or not?
FORM FORM_POST_DATA.
  IF X_POST = 'X'.
    CALL FUNCTION 'BAPI_ACC_BILLING_POST'
      EXPORTING
        DOCUMENTHEADER    = WA_DOCUMENTHEADER
        CUSTOMERCPD       = WA_CUSTOMERCPD
      IMPORTING
        OBJ_TYPE          = REF_TYPE
        OBJ_KEY           = REF_KEY
        OBJ_SYS           = REF_SYS
      TABLES
        ACCOUNTRECEIVABLE = IT_ACCOUNTRECEIVABLE
        ACCOUNTGL         = IT_ACCOUNTGL
        ACCOUNTTAX        = IT_ACCOUNTTAX
        CURRENCYAMOUNT    = IT_CURRENCYAMOUNT
        RETURN            = IT_RETURN
        EXTENSION1        = IT_EXTENSION1.
*   Collect the BAPI messages for later display
    LOOP AT IT_RETURN.
      MOVE-CORRESPONDING IT_RETURN TO IT_MESSAGES.
      APPEND IT_MESSAGES.
    ENDLOOP.
  ENDIF.
  WRITE: / 'Result of Post:'.
  WRITE: / REF_TYPE, REF_KEY, REF_SYS.
  PERFORM FORM_SHOW_MESSAGES.
ENDFORM.                    "FORM_POST_DATA
FORM FORM_SHOW_MESSAGES.
  IF IT_RETURN[] IS INITIAL.
    WRITE: / 'no messages'.
  ELSE.
    SKIP 1.
    LOOP AT IT_RETURN.
      WRITE: /    IT_RETURN-TYPE,
             (2)  IT_RETURN-ID,
                  IT_RETURN-NUMBER,
             (80) IT_RETURN-MESSAGE,
                  IT_RETURN-LOG_NO,
                  IT_RETURN-LOG_MSG_NO,
                  IT_RETURN-MESSAGE_V1,
                  IT_RETURN-MESSAGE_V2,
                  IT_RETURN-MESSAGE_V3,
                  IT_RETURN-MESSAGE_V4,
             (20) IT_RETURN-PARAMETER,
             (3)  IT_RETURN-ROW,
                  IT_RETURN-FIELD,
                  IT_RETURN-SYSTEM.
    ENDLOOP.
  ENDIF.
  ULINE.
ENDFORM.                    " Show_messages
Nisha... -
In which table we can find the sessions log
Hi all,
I am using BDC with the session method. I need to know the table in which the session log gets updated. One more thing: can I find the session logs for the past one year in SM35? Please help me out.
regards
Paul

Hi, see the below table for logs in the session method:
BDCLM
regards, -
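The batch-input queue tables can also be read directly; a hedged sketch listing sessions created in the last year (field names from the standard dictionary — verify them in SE11; the log contents themselves live in the TemSe, reachable via APQL as shown elsewhere in this thread):

DATA: IT_APQI TYPE TABLE OF APQI,
      WA_APQI TYPE APQI,
      L_DATE  TYPE D.

L_DATE = SY-DATUM - 365.

* Sessions created within the last year
SELECT * FROM APQI INTO TABLE IT_APQI
       WHERE CREDATE GE L_DATE.

LOOP AT IT_APQI INTO WA_APQI.
  WRITE: / WA_APQI-GROUPID, WA_APQI-CREDATE, WA_APQI-CREATOR.
ENDLOOP.

Whether a year-old session is still visible in SM35 depends on whether the logs have been reorganized (e.g. via RSBDC_REORG) in the meantime.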
SQL Errors In Informatica Session Logs
I am running a new installation of Analytics 7.9.6. No customizations have been made, and I'm trying to load data from a vanilla "Vision" instance of EBS 11.5.10. I have successfully completed loading of the Financial Analytics data warehouse tables, and have set up a new Execution Plan to load the Supply Chain/Order Management Analytics data warehouse tables.
In loading the Supply Chain tables, however, Informatica jobs are failing and I'm not able to complete the initial load. Looking at the session logs, I'm discovering invalid SQL! It looks like malformed SQL is being generated (including fundamental errors like "FROM not expected"). When I take these SQL statements and run them in SQL*Developer, I find that there are indeed syntax errors. In one case, the SQL statement that failed was a very simple Select/From/Where with correct syntax; however, the field name in the Where clause did not exist in my EBS instance. ( SDE_ORA_Project_HierarchyDimension_Stage2 - the "SYS_PROGRAM_FLAG" field doesn't exist in the PA_PROJECTS_ALL source table )
Any SC-OM veterans out there with thoughts on why this would be going wrong???
Thank you!

This seems to be related to patchset M being required on the EBS system. See the following Metalink note:
Tasks Related To Projects Table Failed During Full Load [ID 1300293.1]
Here are the details:
Cause
The Employee Expenses subject area in DAC includes several Project Dimension tasks and patchset M has not been applied to EBS 11.5.10
Solution
1). As documented in the Oracle Business Intelligence Applications Release Notes > section 1.3.30 Implementation of Oracle Project Analytics with Oracle eBusiness Suite 11.5.10:
Implementation of Oracle Project Analytics with Oracle eBusiness Suite 11.5.10
requires Family Pack M (11i.PJ_PF.M) to be applied to Oracle eBusiness Suite 11.5.10.
2). The customer does not use OBI Project Analytics as it is not used in eBusiness at the client site.
In this case, the recommendation is to disable the Project Dimension as documented in the Oracle Business Intelligence Applications Configuration Guide for Informatica PowerCenter Users > 6.2.1.1 How to Disable Project Dimensions -
Error from the session log between Informatica and SAP BI
Hi friends,
I am working on extraction from BI with Informatica 8.6.1.
Now, I start the process chain from BI, and I get an error in Informatica's session log. Please help me figure out what's going on during my execution.
Severity Timestamp Node Thread Message Code Message
INFO 2010-12-17 11:01:31 node01_dcinfa01 DIRECTOR TM_6228 Writing session output to log file [D:\Informatica\PowerCenter8.6.1\server\infa_shared\SessLogs\s_taorh.log].
INFO 2010-12-17 11:01:31 node01_dcinfa01 DIRECTOR TM_6014 Initializing session [s_taorh] at [Fri Dec 17 11:01:31 2010].
INFO 2010-12-17 11:01:31 node01_dcinfa01 DIRECTOR TM_6683 Repository Name: [RepService_dcinfa01]
INFO 2010-12-17 11:01:31 node01_dcinfa01 DIRECTOR TM_6684 Server Name: [IntService_dcinfa01]
INFO 2010-12-17 11:01:31 node01_dcinfa01 DIRECTOR TM_6686 Folder: [xzTraining]
INFO 2010-12-17 11:01:31 node01_dcinfa01 DIRECTOR TM_6685 Workflow: [wf_taorh] Run Instance Name: [] Run Id: [43]
INFO 2010-12-17 11:01:31 node01_dcinfa01 DIRECTOR TM_6101 Mapping name: m_taorh.
INFO 2010-12-17 11:01:31 node01_dcinfa01 DIRECTOR TM_6964 Date format for the Session is [MM/DD/YYYY HH24:MI:SS.US]
INFO 2010-12-17 11:01:31 node01_dcinfa01 DIRECTOR TM_6703 Session [s_taorh] is run by 32-bit Integration Service [node01_dcinfa01], version [8.6.1], build [1218].
INFO 2010-12-17 11:01:31 node01_dcinfa01 MANAGER PETL_24058 Running Partition Group [1].
INFO 2010-12-17 11:01:31 node01_dcinfa01 MANAGER PETL_24000 Parallel Pipeline Engine initializing.
INFO 2010-12-17 11:01:31 node01_dcinfa01 MANAGER PETL_24001 Parallel Pipeline Engine running.
INFO 2010-12-17 11:01:31 node01_dcinfa01 MANAGER PETL_24003 Initializing session run.
INFO 2010-12-17 11:01:31 node01_dcinfa01 MAPPING CMN_1569 Server Mode: [UNICODE]
INFO 2010-12-17 11:01:31 node01_dcinfa01 MAPPING CMN_1570 Server Code page: [MS Windows Simplified Chinese, superset of GB 2312-80, EUC encoding]
INFO 2010-12-17 11:01:31 node01_dcinfa01 MAPPING TM_6151 The session sort order is [Binary].
INFO 2010-12-17 11:01:31 node01_dcinfa01 MAPPING TM_6156 Using low precision processing.
INFO 2010-12-17 11:01:31 node01_dcinfa01 MAPPING TM_6180 Deadlock retry logic will not be implemented.
INFO 2010-12-17 11:01:31 node01_dcinfa01 MAPPING SDKS_38029 Loaded plug-in 300320: [PowerExchange for SAP BW - OHS reader plugin 8.6.1 build 183].
INFO 2010-12-17 11:01:31 node01_dcinfa01 MAPPING SDKS_38024 Plug-in 300320 initialization complete.
INFO 2010-12-17 11:01:31 node01_dcinfa01 MAPPING PCCL_97003 [WARNING] Real-time session is not enabled for source [AMGDSQ_IS_TAORH]. Real-time Flush Latency value must be 1 or higher for a session to run in real time.
INFO 2010-12-17 11:01:31 node01_dcinfa01 MAPPING SDKS_38016 Reader SDK plug-in intialization complete.
INFO 2010-12-17 11:01:31 node01_dcinfa01 MAPPING TM_6307 DTM error log disabled.
INFO 2010-12-17 11:01:31 node01_dcinfa01 MAPPING TE_7022 TShmWriter: Initialized
INFO 2010-12-17 11:01:31 node01_dcinfa01 MAPPING TM_6007 DTM initialized successfully for session [s_taorh]
INFO 2010-12-17 11:01:31 node01_dcinfa01 DIRECTOR PETL_24033 All DTM Connection Info: [<NONE>].
INFO 2010-12-17 11:01:31 node01_dcinfa01 MANAGER PETL_24004 PETL_24004 Starting pre-session tasks. : (Fri Dec 17 11:01:31 2010)
INFO 2010-12-17 11:01:31 node01_dcinfa01 MANAGER PETL_24027 PETL_24027 Pre-session task completed successfully. : (Fri Dec 17 11:01:31 2010)
INFO 2010-12-17 11:01:31 node01_dcinfa01 DIRECTOR PETL_24006 Starting data movement.
INFO 2010-12-17 11:01:31 node01_dcinfa01 MAPPING TM_6660 Total Buffer Pool size is 1219648 bytes and Block size is 65536 bytes.
INFO 2010-12-17 11:01:31 node01_dcinfa01 READER_1_1_1 OHS_99013 [INFO] Partition 0: Connecting to SAP system with DESTINATION = sapbw, USER = taorh, CLIENT = 800, LANGUAGE = en
INFO 2010-12-17 11:01:32 node01_dcinfa01 READER_1_1_1 OHS_99016 [INFO] Partition 0: BW extraction for Request ID [163] has started.
Edited by: bi_tao on Dec 18, 2010 11:46 AM

INFO 2010-12-17 11:01:33 node01_dcinfa01 WRITER_1_*_1 WRT_8167 Start loading table [VENDOR] at: Fri Dec 17 11:01:32 2010
INFO 2010-12-17 11:01:33 node01_dcinfa01 WRITER_1_*_1 WRT_8168 End loading table [VENDOR] at: Fri Dec 17 11:01:32 2010
INFO 2010-12-17 11:01:33 node01_dcinfa01 WRITER_1_*_1 WRT_8141
Commit on end-of-data Fri Dec 17 11:01:32 2010
===================================================
WRT_8036 Target: VENDOR (Instance Name: [VENDOR])
WRT_8044 No data loaded for this target
INFO 2010-12-17 11:01:33 node01_dcinfa01 WRITER_1_*_1 WRT_8143
Commit at end of Load Order Group Fri Dec 17 11:01:32 2010
===================================================
WRT_8036 Target: VENDOR (Instance Name: [VENDOR])
WRT_8044 No data loaded for this target
INFO 2010-12-17 11:01:33 node01_dcinfa01 WRITER_1_*_1 WRT_8035 Load complete time: Fri Dec 17 11:01:32 2010
LOAD SUMMARY
============
WRT_8036 Target: VENDOR (Instance Name: [VENDOR])
WRT_8044 No data loaded for this target
INFO 2010-12-17 11:01:33 node01_dcinfa01 WRITER_1_*_1 WRT_8043 ****END LOAD SESSION****
INFO 2010-12-17 11:01:33 node01_dcinfa01 WRITER_1_*_1 WRT_8006 Writer run completed.
INFO 2010-12-17 11:01:33 node01_dcinfa01 MANAGER PETL_24031
RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
Thread [READER_1_1_1] created for [the read stage] of partition point [AMGDSQ_IS_TAORH] has completed. The total run time was insufficient for any meaningful statistics.
Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [AMGDSQ_IS_TAORH] has completed. The total run time was insufficient for any meaningful statistics.
Thread [WRITER_1_*_1] created for [the write stage] of partition point [VENDOR] has completed. The total run time was insufficient for any meaningful statistics.
INFO 2010-12-17 11:01:33 node01_dcinfa01 MANAGER PETL_24005 PETL_24005 Starting post-session tasks. : (Fri Dec 17 11:01:33 2010)
INFO 2010-12-17 11:01:33 node01_dcinfa01 MANAGER PETL_24029 PETL_24029 Post-session task completed successfully. : (Fri Dec 17 11:01:33 2010)
INFO 2010-12-17 11:01:33 node01_dcinfa01 MAPPING SDKS_38025 Plug-in 300320 deinitialized and unloaded with status [-1].
INFO 2010-12-17 11:01:33 node01_dcinfa01 MAPPING SDKS_38018 Reader SDK plug-ins deinitialized with status [-1].
INFO 2010-12-17 11:01:33 node01_dcinfa01 MAPPING TM_6018 The session completed with [0] row transformation errors.
INFO 2010-12-17 11:01:33 node01_dcinfa01 MANAGER PETL_24002 Parallel Pipeline Engine finished.
INFO 2010-12-17 11:01:33 node01_dcinfa01 DIRECTOR PETL_24013 Session run completed with failure.
INFO 2010-12-17 11:01:34 node01_dcinfa01 DIRECTOR TM_6022
SESSION LOAD SUMMARY
================================================
INFO 2010-12-17 11:01:34 node01_dcinfa01 DIRECTOR TM_6252 Source Load Summary.
INFO 2010-12-17 11:01:34 node01_dcinfa01 DIRECTOR CMN_1537 Table: [AMGDSQ_IS_TAORH] (Instance Name: [AMGDSQ_IS_TAORH]) with group id[1] with view name [Group1]
Rows Output [0], Rows Affected [0], Rows Applied [0], Rows Rejected[0]
INFO 2010-12-17 11:01:34 node01_dcinfa01 DIRECTOR TM_6253 Target Load Summary.
INFO 2010-12-17 11:01:34 node01_dcinfa01 DIRECTOR CMN_1740 Table: [VENDOR] (Instance Name: [VENDOR])
Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
INFO 2010-12-17 11:01:34 node01_dcinfa01 DIRECTOR TM_6023
===================================================
INFO 2010-12-17 11:01:34 node01_dcinfa01 DIRECTOR TM_6020 Session [s_taorh] completed at [Fri Dec 17 11:01:33 2010]. -
Bug 3.2.20.09 Build MAIN-09.87 - Using GUI to truncate - Wrong table
Here is what I am running into with this version of SQL developer. I can repeat this issue every time with these steps.
From Connections, open a connection, list tables.
Right click on a table, select Privileges - Grant. Select the user ID, the actions & Apply. (You do not need to take these actions, just go through the motions.)
At this point it does not seem to matter if I stay within this connection or select a different connection.
Select a different table by clicking on it so it opens a new tab.
Select Actions - Table - Truncate.
At this point you would expect the pop-up to display the table you are trying to take action on, instead what happens is you are presented with the last table you took Privilege actions on. This will even open a closed connection to take these actions on.
So if you are not paying careful attention, you are about to truncate the wrong table!
I am also able to repeat this with 3.1.07.42, where you are presented with the last table you went through the Privileges motions on. (You do not have to take any actions.)
Edited by: user2794298 on Jan 31, 2013 2:55 PM

Now all I need to know is when this gets corrected, so I can download an updated version of SQL Developer to reduce the chance of accidentally truncating the wrong table again due to a software bug.
-
Why does a query against TABLE(dbms_xplan.display) take so long?
The environment is PROD; the version is Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit.
Most of the time, when I do an explain plan and query TABLE(dbms_xplan.display), it takes 3-8 minutes to display the result.
So, I wonder what might be going wrong here?
How would a simple query take this long?
Any thoughts or guesses are welcome.
Thanks

That is not an option here. I only have query privileges on PROD.
All I can do is figure out the reason for the slowness and advise the fix.
Thanks. -
Is there any Table to display all Tcodes used in BW
Hi
Is there any table available where all the tcodes can be seen?
Thanks,
Jaswanth

Hi,
these are the tcodes in BW:
RS00 Start menu
RS12 Overview of master data locks
RSA0 Content Settings Maintenance
RSA1 BW Administrator Workbench
RSA10 Realtime Test Interface Srce System
RSA11 Calling up AWB with the IC tree
RSA12 Calling up AWB with the IS tree
RSA13 Calling up AWB with the LG tree
RSA14 Calling up AWB with the IO tree
RSA15 Calling up AWB with the ODS tree
RSA1OLD BW Administrator Workbench (old)
RSA2 OLTP Metadata Repository
RSA3 Extractor Checker
RSA5 Install Business Content
RSA6 Maintain DataSources
RSA7 BW Delta Queue Monitor
RSA8 DataSource Repository
RSA9 Transfer Application Components
RSADMIN RSADMIN maintenance
RSADRTC70TOADR11 Conversion of table TC70 in ADR11
RSANWB Model the Analysis Process
RSANWB_CRM_ATTR Fill CRM Attributes
RSANWB_EXEC Execute Analysis Process
RSANWB_IMP Calculation of Importance
RSANWB_START_ALL Model the Analysis Process
RSANWB_SURVEY Analysis Process: Create Target Grp
RSAN_CLTV CLTV Modeling
RSAN_CLTV1 CLTV
RSAN_RESP Response Prediction Models
RSAN_RFM RFM Modeling
RSAN_SALES_PL_CALL Execute Sales Planning
RSAN_SURV_SHOW BW Survey
RSAN_SURV_TG BW Survey: Target Group Management
RSAN_VERI Analysis Process: Test Monitor
RSAN_WB_TST IMC Wrapper Transaction for Testing
RSARCH_ADMIN BW Archive Administration
RSARFCEX Variant for RSARFCEX
RSASSIBTCH Schedule Assistant in Background
RSATTR Attribute/Hierarchy Realignment Run
RSAWB New AWB
RSAWBSETTINGSDEL Delete user settings of the AWB
RSB0 Maintain OLAP authorization object
RSB1 Display authorization object
RSB2 Data Marts Generation Center
RSBBS Maintaining BW Sender-Receiver
RSBBS_WEB Transaction for the RRI in the Web
RSBCTMA_AC xCBL Action Codes
RSBCTMA_DT Mapping of Ext./Int. Document Type
RSBEB Business Explorer Browser
RSBMO2 Open Hub Monitor
RSBO Open Hub Maintenance
RSBOH1 Open Hub Maintenance
RSBOH2 Open Hub Maintenance
RSBOH3 Open Hub Maintenance
RSBO_EXTRACT Auth Check Open Hub Extraction
RSBROWSER BW Browser
RSBWREMOTE Create Warehouse User
RSCATTAWB CATT Admin. Workbench
RSCDS Summarization routine
RSCONCHA Channel conversion
RSCONFAV Favorites Conversion
RSCRMDEBUG Set Debug Options
RSCRMISQ Regis. of Infosets for Target Groups
RSCRMMDX Edit MDX
RSCRMMON Monitor Query Extracts
RSCRMSCEN Regist. Closed-Loop Scenarios
RSCRM_BAPI Test Program for RSCRM Interface
RSCRM_REPORT BW Queries with ODBO (to 2nd 0B)
RSCRT BW Monitor (Near)-Real-Time Loading
RSCR_MAINT_PUBLISH Maint. of Publishing Variables CR/CE
RSCR_MAINT_URL Maint. of URL Variables for CR/CE
RSCUSTA Maintain BW Settings
RSCUSTA2 ODS Settings
RSCUSTV1 BW Customizing - View 1
RSCUSTV10 BW Customizing - View 10
RSCUSTV11 BW Customizing - View 11
RSCUSTV12 Microsoft Analysis Services
RSCUSTV13 RRI Settings for Web Reporting
RSCUSTV14 OLAP: Cache Parameters
RSCUSTV15 BW Customizing - View 11
RSCUSTV16 BW Reporting
RSCUSTV17 Settings: Currency Translation
RSCUSTV18 DB Connect Settings
RSCUSTV19 InfoSet Settings
RSCUSTV2 BW Customizing - View 2
RSCUSTV3 BW Customizing - View 3
RSCUSTV4 BW Customizing - View 4
RSCUSTV5 BW Customizing - View 5
RSCUSTV6 BW Customizing - View 6
RSCUSTV7 BW Customizing - View 7
RSCUSTV8 BW Customizing - View 8
RSCUSTV9 BW Customizing - View 9
RSD1 Characteristic maintenance
RSD2 Maintenance of key figures
RSD3 Maintenance of units
RSD4 Maintenance of time characteristics
RSD5 Internal: Maint. of Techn. Chars
RSDBC DB connect
RSDB_ADD_ID_2_CRM Create External ID for CRM-GP
RSDB_INIT Initial Download of D&B Data
RSDCUBE Start: InfoCube editing
RSDCUBED Start: InfoCube editing
RSDCUBEM Start: InfoCube editing
RSDDV Maintaining Aggregates
RSDIOBC Start: InfoObject catalog editing
RSDIOBCD Start: InfoObject catalog editing
RSDIOBCM Start: InfoObject catalog editing
RSDL DB Connect - Test Program
RSDMD Master Data Maintenance w.Prev. Sel.
RSDMD_TEST Master Data Test
RSDMPRO Initial Screen: MultiProvider Proc.
RSDMPROD Initial Screen: MultiProvider Proc.
RSDMPROM Initial Screen: MultiProvider Proc.
RSDMWB Data Mining Workbench
RSDODS Initial Screen: ODS Object Processng
RSDODSD Initial Screen: ODS Proces. (Deliv.)
RSDPMDDBSETUP Creates a MOLAP Database in MSAS
RSDPMOLAPDS MOLAP DataSource creation
RSDPRFCDSETUP Create MOLAP Rfc Tests
RSDSD DataSource Documentation
RSDU_SHOWTEMPINCTAB RSDU_SHOWTEMPINCTAB
RSDV Validity Slice Maintenance
RSD_ACAT Maintain InfoObject catalog
RSEDIT Old editor
RSEIDOCM Variant for RSEIDOCM
RSENQ Display of Lock Log
RSEOUT00 Variant for RSEOUT00
RSFH Test Transaction Data Extractors
RSFLAT Flat MDX
RSFREQUPL Frequent upload from source systems
RSGWLST Accessible Gateways
RSH1 Edit hierarchy initial screen
RSH3 Simulate hierarchies
RSHIER Hierarchy maintenance w/o AdmWB
RSHIERINT Hierarchy maintenance from AdmWB
RSHIERSIM Simulate hierarchies
RSICUBE Maintain/Change InfoCubes (Internal)
RSIMG BW IMG
RSIMPCUR Load Exchange Rates from File
RSINPUT Manual Data Entry
RSIR_DELTATRACK KPro Delta Tracking
RSISET Maintain InfoSets
RSKC Maintaining the Permittd Extra Chars
RSLDAPSYNC_USER LDAP Synchronization of Users
RSLGMP Maintain RSLOGSYSMAP
RSMD Extractor Checker
RSMDCNVEXIT Conversn to Consistent Intern. Vals
RSMDEXITON Activate Conversion Routine
RSMO Data Load Monitor Start
RSMON BW Administrator Workbench
RSMONCOLOR Traffic light color in the Monitor
RSMONITOR_DB D&B Integration
RSMONMAIL Mail Addresses for Monitor Assistant
RSNPGTEST Test Network Plan Control
RSNPGTEST2 Test Network Plan Control
RSNSPACE BW Namespace Maintenance
RSO2 Oltp Metadata Repository
RSO3 Set Up Deltas for Master Data
RSOCONTENT Administration of a Content System
RSOCOPY Copy from TLOGO Objects
RSODADMIN Administration BW Document Managemt.
RSOR BW Metadata Repository
RSORBCT BI Business Content Transfer
RSORMDR BW Metadata Repository
RSPC Process Chain Maintenance
RSPC1 Process Chain Display
RSPCM Monitor daily process chains
RSPFPAR Display profile parameter
RSQ02 Maintain InfoSets
RSQ10 SAP Query: Role Administration
RSQ11 InfoSet Query: Web reporting
RSRAJ Starts a Reporting Agent Job
RSRAM Reporting Agent Monitor
RSRAPS Manages Page Store
RSRCACHE OLAP: Cache Monitor
RSRCATTTRACE Catt transaction for trace tool
RSREP BW Administrator Workbench
RSRFCCHK RFC destinations with logon data
RSRHIERARCHYVIRT Maintain Virtual Time Hierarchies
RSRQ Data Load Monitor for a Request
RSRR_WEB Report-Report Interface in Web
RSRT Start of the report monitor
RSRT1 Start of the Report Monitor
RSRT2 Start of the Report Monitor
RSRTRACE Set trace configuration
RSRTRACETEST Trace tool configuration
RSRV Analysis and Repair of BW Objects
RSRVALT Analysis of the BW objects
RSR_TRACE Trace Monitor
RSR_WEB_VARIABLES Variable Entry in Web
RSSCD100_PFCG Change Docs for Role Administration
RSSCD100_PFCG_USER for Role Assignment
RSSCM_APPL Application settings SCM4.0 and BW
RSSD Access for scheduler
RSSE Selection start InfoCube
RSSGPCLA Maintain program class
RSSG_BROWSER Simple Data Browser
RSSM Authorizations for Reporting
RSSMQ Start Query with User
RSSMTRACE Reporting Log Authorization
RSSTARTMON Starting the monitor in parall.proc.
RSSU53 Display authorization check BW
RST22 Old Short-Dump Overview
RSTB Choose Object Name
RSTBHIST Table history
RSTG_BUPA Target Group Sel. Business Partners
RSTG_CUST Target Group Selection Customers
RSTG_DB Target Group Selection D&B
RSTG_DB_WEB Target Group Selection D&B
RSTPRFC Create Destination for After-Import
RSU0 Update rules overview
RSU1 Create update rules
RSU1I Create update rules
RSU1O Create Update Rules
RSU2 Change update rules
RSU2I Change update rules
RSU2O Change Update Rules
RSU3 Display update rules
RSU3I Display update rules
RSU3O Display Update Rules
RSU6 Delete update rules
RSU6I Delete update rules
RSU6O Delete update rules
RSU7 Data Extraction: Maintain Parameters
RSUSR003 Check standard user passwords
RSUSR200 List of Users per Login Date
RSWELOGD Delete Event Trace
RSWEWWDHMSHOW Display Background Job SWWERRE
RSWEWWDHSHOW Display Work Item Deadline Monitorng
RSWWCLEAR Execute Work Item Clearing Work
RSWWCOND Execute Work Item Rule Monitoring
RSWWDHEX ExecuteWorkItemDeadlineMonitoring
RSWWERRE Start RSWWERRE
RSZC Copying Queries between InfoCubes
RSZDELETE Deletion of query objects
RSZT Get Test Component
RSZTESTFB Shortcut Function Test Environment
RSZV Call up of view V_RSZGLOBV
RSZVERSION Set frontend version
RS_AWB_REMOTE Remote AWB Staging
RS_BCT_BWBEOTYP Maintain BW Backend Object Types
RS_DS_CHECK Check consistency request
RS_ISTD_REMOTE Maintain InfoSource
RS_LOGSYS_CHECK Source System Tool
RS_PERS_ACTIVATE Activation of BEx Personalization
RS_PERS_BOD_ACTIVATE Activate BEx Open Pers.
RS_PERS_BOD_DEACTIVA Deactivate Pers. for BEx Open
RS_PERS_VAR_ACTIVATE Activate Variable Pers.
RS_PERS_VAR_DEACTIVA Deactivate Pers. for Variables
RS_PERS_WTE_ACTIVATE Activate Web Template Pers.
RS_PERS_WTE_DEACTIVA Deactivate Pers. for Web Template
SP01 Spool -
Record display to match log in user
I'm trying to create a login area with user information. I have used the Update Record behavior and it does display a record, but only the first record that I input, no matter what username I log in with. I want the record display to match the logged-in user's information.

Heya Joe,
Try this:
1. Restrict access to page for logged in users.
2. Create form you'll use for Update Record Server Behavior.
3. Add a hidden field in the form for primary_key name and id
= primary_key
4. Add the value for primary_key as the session login_id from
the Bindings Tab in Dreamweaver.
5. Add Update Record Server Behavior. Select primary_key form
field for integer primary key in Update Record SB and add other
form fields corresponding to update record accordingly.
6. save, put, login, visit, update logged in users record.
Hope that helps! -
Regarding Session log messages format..
Hi All,
I am reading session logs as follows,
CALL FUNCTION 'RSTS_READ'
TABLES
DATATAB = LOG_TABLE
EXCEPTIONS
FB_CALL_HANDLE = 4
FB_ERROR = 8
FB_RSTS_NOCONV = 12
FB_RSTS_OTHER = 16
OTHERS = 16.
I am getting all my messages in LOG_TABLE,
but those messages are not formatted. Is there any function module to format the session log messages?
Please let me know.
Thanks
Suresh

Hi,
try with FM format_message, it works.
see the sample code,
data: l_message type string.
if tab_error-msgtyp = 'I' or tab_error-msgtyp = 'S'.
call function 'FORMAT_MESSAGE'
exporting
id = tab_error-msgid
lang = 'EN'
no = tab_error-msgnr
v1 = tab_error-msgv1
v2 = tab_error-msgv2
v3 = tab_error-msgv3
v4 = tab_error-msgv4
importing
msg = l_message
exceptions
not_found = 1
others = 2.
if sy-subrc <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
endif.
reward points if useful,
regards,
seshu. -
Session/log reorganization
Hi all,
We have an R/3 system 46C-SP52 on Windows with Oracle as the database (9.2.0.6), and we have a consistency problem in the TemSe. I have already checked/deleted all inconsistencies and checked all the parameters linked to output. Finally, I checked tables APQI, APQL and APQD, and used program RSBDC_REORG to reorganize sessions/logs; the print report shows the sessions (dated year 2003) but no deletion is done. So I found that table APQI includes many entries for year 2003 that are not in APQL. My question: is there a tool to reorganize this inconsistency? I don't want to write an Oracle statement to delete all entries in this table, because based on the trace the updated tables are APQD, APQI, APQL and TST01.
Regards,
Aziz K.

Check this SAP Note:
https://service.sap.com/sap/support/notes/48400 -
Hi All,
I want to retrive the BDC Session Log from the already processed Session by a program.
Is there any table that holds the BDC Session Log?
Regards,
Siva

Siva,
Have a look at program RSBDC_PROTOCOL. When you debug this program you can see that Function Modules RSTS_OPEN_RLC and RSTS_READ are used for reading the session logs from the Temse.
Regards,
Arjan Aalbers -
Problem with display of transport logs
Hello,
The transport logs on one of our systems are not displayed at the SAP level. The logs are available in the trans directory with the logged information. Logs for other transport requests on the same day from the same user are displayed, but certain logs are not displayed.
I go into the import history of a certain system and click on the display-log button. In the log overview screen nothing is displayed.
Any suggestions would be appreciated!
regards.

Date of the log is 12/06/09 - there were many imports on this day and logs for all are available. We are unable to view the logs for around 10 imports. There is no error message as such; once you highlight the change request in import history and click on "Logs", the next screen has the text "log display for <CR number>" and below this nothing is shown. The logs at OS level are readable using the unix "view" command.
-
Reverse Mapping Tutorial - Finder.java queries the wrong table?!
I have been almost successful in running the Reverse Mapping Tutorial, by
creating Java Classes from the hsqldb sample database, and running the JDO
Enhancer on them.
However, I cannot get the Finder.java to work. It seems to look in the wrong
table: MAGAZINEX instead of MAGAZINE?
Did anyone have trouble with this step, or run it successfully?
Liviu
PS: here is the trace:
0 [main] INFO kodo.Runtime - Starting Kodo JDO version 2.4.2
(kodojdo-2.4.2-20030326-1841) with capabilities: [Enterprise Edition
Features, Standard Edition Features, Lite Edition Features, Evaluation
License, Query Extensions, Datacache Plug-in, Statement Batching, Global
Transactions, Developer Tools, Custom Database Dictionaries, Enterprise
Databases]
70 [main] WARN kodo.Runtime - WARNING: Kodo JDO Evaluation expires in 25 days. Please contact [email protected] for information on extending your evaluation period or purchasing a license.
68398 [main] INFO kodo.MetaData - com.solarmetric.kodo.meta.JDOMetaDataParser@19eda2c: parsing source: file:/C:/Documents%20and%20Settings/default/jbproject/JDO/classes/reversetutorial.jdo
74577 [main] INFO jdbc.JDBC - [ C:24713456; T:31737213; D:22310332 ] open: jdbc:hsqldb:hsql_sample_database (sa)
75689 [main] INFO jdbc.JDBC - [ C:24713456; T:31737213; D:22310332 ] close: com.solarmetric.datasource.PoolConnection@17918f0[[requests=0;size=0;max=70;hits=0;created=0;redundant=0;overflow=0;new=0;leaked=0;unavailable=0]]
75699 [main] INFO jdbc.JDBC - [ C:24713456; T:31737213; D:22310332 ] close connection
77331 [main] INFO jdbc.JDBC - Using dictionary class "com.solarmetric.kodo.impl.jdbc.schema.dict.HSQLDictionary" to connect to "HSQL Database Engine" (version "1.7.0") with JDBC driver "HSQL Database Engine Driver" (version "1.7.0")
1163173 [main] INFO jdbc.JDBC - [ C:3093871; T:31737213; D:22310332 ] open: jdbc:hsqldb:hsql_sample_database (sa)
1163293 [main] INFO jdbc.SQL - [ C:3093871; T:31737213; D:22310332 ] preparing statement <17940412>: SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX
1163313 [main] INFO jdbc.SQL - [ C:3093871; T:31737213; D:22310332 ] executing statement <17940412>: [reused=1;params={}]
1163443 [main] INFO jdbc.JDBC - [ C:3093871; T:31737213; D:22310332 ] close: com.solarmetric.datasource.PoolConnection@2f356f[[requests=1;size=0;max=70;hits=0;created=1;redundant=0;overflow=0;new=1;leaked=0;unavailable=0]]
1163443 [main] INFO jdbc.JDBC - [ C:3093871; T:31737213; D:22310332 ] close connection
Hit uncaught exception javax.jdo.JDOFatalDataStoreException
javax.jdo.JDOFatalDataStoreException: com.solarmetric.kodo.impl.jdbc.sql.SQLExceptionWrapper:
[SQL=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
[PRE=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
Table not found: S0002 Table not found: MAGAZINEX in statement [SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX] [code=-22;state=S0002]
NestedThrowables:
com.solarmetric.kodo.impl.jdbc.sql.SQLExceptionWrapper:
[SQL=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
[PRE=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
Table not found: S0002 Table not found: MAGAZINEX in statement [SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
at com.solarmetric.kodo.impl.jdbc.runtime.SQLExceptions.throwFatal(SQLExceptions.java:17)
at com.solarmetric.kodo.impl.jdbc.ormapping.SubclassProviderImpl.getSubclasses(SubclassProviderImpl.java:283)
at com.solarmetric.kodo.impl.jdbc.ormapping.ClassMapping.getPrimaryMappingFields(ClassMapping.java:1093)
at com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.executeQuery(JDBCStoreManager.java:704)
at com.solarmetric.kodo.impl.jdbc.runtime.JDBCQuery.executeQuery(JDBCQuery.java:93)
at com.solarmetric.kodo.query.QueryImpl.executeWithMap(QueryImpl.java:792)
at com.solarmetric.kodo.query.QueryImpl.execute(QueryImpl.java:595)
at reversetutorial.Finder.main(Finder.java:32)
NestedThrowablesStackTrace:
java.sql.SQLException: Table not found: S0002 Table not found: MAGAZINEX in statement [SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
at org.hsqldb.Trace.getError(Trace.java:226)
at org.hsqldb.jdbcResultSet.<init>(jdbcResultSet.java:6595)
at org.hsqldb.jdbcConnection.executeStandalone(jdbcConnection.java:2951)
at org.hsqldb.jdbcConnection.execute(jdbcConnection.java:2540)
at org.hsqldb.jdbcStatement.fetchResult(jdbcStatement.java:1804)
at org.hsqldb.jdbcStatement.executeQuery(jdbcStatement.java:199)
at org.hsqldb.jdbcPreparedStatement.executeQuery(jdbcPreparedStatement.java:391)
at com.solarmetric.datasource.PreparedStatementWrapper.executeQuery(PreparedStatementWrapper.java:93)
at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executePreparedQueryInternal(SQLExecutionManagerImpl.java:771)
at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeQueryInternal(SQLExecutionManagerImpl.java:691)
at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeQuery(SQLExecutionManagerImpl.java:372)
at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeQuery(SQLExecutionManagerImpl.java:356)
at com.solarmetric.kodo.impl.jdbc.ormapping.SubclassProviderImpl.getSubclasses(SubclassProviderImpl.java:246)
at com.solarmetric.kodo.impl.jdbc.ormapping.ClassMapping.getPrimaryMappingFields(ClassMapping.java:1093)
at com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.executeQuery(JDBCStoreManager.java:704)
at com.solarmetric.kodo.impl.jdbc.runtime.JDBCQuery.executeQuery(JDBCQuery.java:93)
at com.solarmetric.kodo.query.QueryImpl.executeWithMap(QueryImpl.java:792)
at com.solarmetric.kodo.query.QueryImpl.execute(QueryImpl.java:595)
at reversetutorial.Finder.main(Finder.java:32)

The reason I did not run importtool is because ... I actually ran it, but it
was not successful.
I have now tried the solutions directory from the Kodo distribution, and that failed as well. Here is what I did:
- I went to reversetutorial/solutions, compiled all the classes, and placed them into a reversetutorial folder (to match the package)
- ran "rd-importtool reversetutorial.mapping" (the mapping file from the solutions directory), which failed as below:
0 [main] INFO kodo.MetaData - Parsing metadata resource "file:/C:/kodo/reversetutorial/solutions/reversetutorial.mapping".
Exception in thread "main" com.solarmetric.rd.kodo.meta.JDOMetaDataNotFoundException: No JDO metadata was found for type "class reversetutorial.Article".
FailedObject:class reversetutorial.Article
at com.solarmetric.rd.kodo.meta.JDOMetaDataRepositoryImpl.getMetaData(JDOMetaDataRepositoryImpl.java:148)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMetaData(MappingRepository.java:147)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMapping(MappingRepository.java:158)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.getMapping(ImportTool.java:126)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.importMappings(ImportTool.java:57)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.run(ImportTool.java:408)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.main(ImportTool.java:385)
Any idea why? The solutions directory should work, right? I even tried
specifying a kodo.properties file, but it did not seem to help.
Liviu
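A "No JDO metadata was found for type" failure from the importtool generally means it could not find the .jdo metadata file for the generated classes. As a quick sanity check, a small standalone helper (hypothetical, not part of Kodo) can parse a .jdo metadata file and list which classes it actually declares, so you can confirm the file the tool is seeing really covers reversetutorial.Article:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class JdoMetaCheck {

    // Return the fully-qualified class names declared in a .jdo metadata stream.
    static List<String> declaredClasses(InputStream in) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(in);
        List<String> names = new ArrayList<>();
        NodeList pkgs = doc.getElementsByTagName("package");
        for (int i = 0; i < pkgs.getLength(); i++) {
            Element pkg = (Element) pkgs.item(i);
            String prefix = pkg.getAttribute("name");
            NodeList classes = pkg.getElementsByTagName("class");
            for (int j = 0; j < classes.getLength(); j++) {
                String cls = ((Element) classes.item(j)).getAttribute("name");
                names.add(prefix.isEmpty() ? cls : prefix + "." + cls);
            }
        }
        return names;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical metadata of the shape reversemappingtool generates.
        String jdo =
            "<?xml version=\"1.0\"?>\n" +
            "<jdo>\n" +
            "  <package name=\"reversetutorial\">\n" +
            "    <class name=\"Article\"/>\n" +
            "    <class name=\"Magazine\"/>\n" +
            "  </package>\n" +
            "</jdo>\n";
        System.out.println(declaredClasses(
            new ByteArrayInputStream(jdo.getBytes(StandardCharsets.UTF_8))));
    }
}
```

If a class the tool complains about is missing from the output, the .jdo file is not on the classpath the tool is using, or it lives under a different package element than expected.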
"Abe White" <[email protected]> wrote in message
news:[email protected]...
Running the reversemappingtool creates classes, metadata files, and a
.mapping file. That .mapping file contains all the O/R mapping
information for how the generated classes map to your existing database
tables. What the importtool does is just transfer that mapping
information to the metadata files, in the form of <extension> elements.
The reason this is a separate step will be clear once Kodo 3.0 comes out.
So in sum, the importtool does not affect the database in any way. It
just moves information from one format (.mapping file) to another
(<extension> elements in the .jdo file).
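To make that concrete, here is a sketch of what a .jdo file might look like after the importtool merges the mapping information in. The table and column names below are hypothetical, patterned on the MAGAZINEX naming visible in the log above, and the exact extension keys vary by Kodo version:

```xml
<jdo>
  <package name="reversetutorial">
    <class name="Magazine">
      <!-- Hypothetical mapping extensions of the kind importtool writes -->
      <extension vendor-name="kodo" key="table" value="MAGAZINEX"/>
      <extension vendor-name="kodo" key="pk-column" value="JDOIDX"/>
      <field name="title">
        <extension vendor-name="kodo" key="data-column" value="TITLEX"/>
      </field>
    </class>
  </package>
</jdo>
```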