Archiving data in ODS
Hello out there
In our company we are going to archive transactional data for the first time.
I am going to create several ODS objects and transfer the data into them before archiving it in R/3.
Now my question: I want to set a flag somewhere so that, once it is set, it is impossible to delete the data in the ODS via transaction RSA1.
I hope this is possible, but I don't know how to do it.
/7 Stisse
Hi dear and welcome on board!
Is your question about authorization settings to prevent potential (undesired) deletion of this data?
I think you can move these ODS objects into a dedicated InfoArea and then avoid assigning anyone access to it...
Otherwise, you can archive data in BW too!
Let me know your specific requirements (do you have to do some reporting on this archived data?)...
Bye,
Roberto
Similar Messages
-
DBIF_REPO_SQL_ERROR short dumps while activating data in ODS
Hi All,
We are using BW 3.5 with Oracle 9.2 on an AIX 5.3 box. For the past few days we have been getting DBIF_REPO_SQL_ERROR short dumps frequently while activating data in an ODS.
Runtime Error DBIF_REPO_SQL_ERROR
Date and Time 08.01.2008 13:01:08
1. A printout of the problem description (short dump)
To obtain this, select in the current display "System->List->
Save->Local File (unconverted)".
2. A suitable printout of the system log
To obtain this, call the system log through transaction SM21.
Limit the time interval to 10 minutes before and 5 minutes
after the short dump. In the display, then select the function
"System->List->Save->Local File (unconverted)".
3. If the programs are your own programs or modified SAP programs,
supply the source code.
To do this, select the Editor function "Further Utilities->
Upload/Download->Download".
4. Details regarding the conditions under which the error occurred
or which actions and input led to the error.
Internal call code.........: "[REPO/*/43/LOAD/CX_SY_OPEN_SQL_DB=============CP
Please check the entries in the system log (Transaction SM21).
You may be able to find an interim solution to the problem
in the SAP note system. If you have access to the note system yourself,
use the following search criteria:
"DBIF_REPO_SQL_ERROR" C
"SAPLRSSM_PROCESS" or "LRSSM_PROCESSF04"
"CHECK_IF_ANALYZE_IS_NESSESSARY"
System environment
SAP Release.............. "640"
Application server....... "psapdb"
Network address.......... "158.58.65.11"
Operating system......... "AIX"
Release.................. "5.3"
Hardware type............ "00CD615C4C00"
Character length......... 16 Bits
Pointer length........... 64 Bits
Work process number...... 22
Short dump setting....... "full"
Database server.......... "psapdb"
Database type............ "ORACLE"
Database name............ "BWP"
Database owner........... "SAPBWP"
Character set............ "C"
SAP kernel............... "640"
Created on............... "Oct 29 2006 20:49:57"
Created in............... "AIX 1 5 00538A4A4C00"
Database version......... "OCI_920 "
Patch level.............. "155"
Patch text............... " "
Supported environment....
Database................. "ORACLE 9.2.0.., ORACLE 10.1.0.., ORACLE 10.2.0.."
SAP database version..... "640"
Operating system......... "AIX 1 5, AIX 2 5, AIX 3 5"
Memory usage.............
Roll..................... 16192
EM....................... 16759424
Heap..................... 0
Page..................... 24576
MM Used.................. 6604384
MM Free.................. 1772536
SAP Release.............. "640"
User and Transaction
Client.............. 200
User................ "R3REMOTE"
Language key........ "E"
Transaction......... " "
Program............. "SAPLRSSM_PROCESS"
Screen.............. "SAPMSSY0 1000"
Screen line......... 6
Information on where terminated
The termination occurred in the ABAP program "SAPLRSSM_PROCESS" in
"CHECK_IF_ANALYZE_IS_NESSESSARY".
The main program was "RSPROCESS ".
The termination occurred in line 1143 of the source code of the (Include)
program "LRSSM_PROCESSF04"
of the source code of program "LRSSM_PROCESSF04" (when calling the editor
11430).
The program "SAPLRSSM_PROCESS" was started as a background job.
Job name........ "BI_PROCESS_ODSACTIVAT"
Job initiator... "RPRIYANKA"
Job number...... 02010102
Also we have a failed job here. Here is the job log.
Job log overview for job: BI_PROCESS_ODSACTIVAT / 02010102
Date Time Message text
08.01.2008 13:01:00 Job started
08.01.2008 13:01:00 Step 001 started (program RSPROCESS, variant &0000000000188, user ID R3REMOTE)
08.01.2008 13:01:02 Activation is running: Data target HBCS_O25, from 1,758 to 1,767
08.01.2008 13:01:02 Data to be activated successfully checked against archiving objects
08.01.2008 13:01:02 SQL: 01/08/2008 13:01:02 R3REMOTE
08.01.2008 13:01:02 ANALYZE TABLE "/BIC/AHBCS_O2540" DELETE
08.01.2008 13:01:02 STATISTICS
08.01.2008 13:01:02 SQL-END: 01/08/2008 13:01:02 00:00:00
08.01.2008 13:01:02 SQL: 01/08/2008 13:01:02 R3REMOTE
08.01.2008 13:01:02 BEGIN DBMS_STATS.GATHER_TABLE_STATS ( OWNNAME =>
08.01.2008 13:01:02 'SAPBWP', TABNAME => '"/BIC/AHBCS_O2540"',
08.01.2008 13:01:02 ESTIMATE_PERCENT => 1 , METHOD_OPT => 'FOR ALL
08.01.2008 13:01:02 INDEXED COLUMNS SIZE 75', DEGREE => 1 ,
08.01.2008 13:01:02 GRANULARITY => 'ALL', CASCADE => TRUE ); END;
08.01.2008 13:01:05 SQL-END: 01/08/2008 13:01:05 00:00:03
08.01.2008 13:01:05 SQL-ERROR: 603 ORA-00603: ORACLE server session terminated by fatal error
08.01.2008 13:01:05 System error: RSDU_ANALYZE_TABLE_ORA/ ORACLE
08.01.2008 13:01:08 ABAP/4 processor: DBIF_REPO_SQL_ERROR
08.01.2008 13:01:08 Job cancelled
The listener is working fine. I checked the RFC connections, tried restarting the system, and tried adding space to the TEMP tablespace as well as to PSAPBWP, but none of that worked. Please help.
The problem got solved: there were DB errors like ORA-01114: IO error writing block to file 256 (block # 126218).
The point to note here is file 256. A file number greater than 255 indicates a temp datafile, which points to an issue with the PSAPTEMP tablespace. On checking PSAPTEMP, the filesystem holding one of its temp datafiles was 100% full, which prevented the temp datafile from growing and shrinking as required.
So, adding more space to the filesystem solved the problem.
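That file-number rule can be sketched with a small helper (the names and the DB_FILES value of 255 are assumptions for illustration, not from the post):

```python
# Hypothetical helper illustrating the rule above: Oracle reports
# tempfiles under numbers above DB_FILES (assumed 255 here), so
# "file 256" in ORA-01114 points at the first tempfile, i.e. at
# PSAPTEMP and the filesystem underneath it.

DB_FILES = 255  # assumed init parameter value on this system

def classify_file(file_no, db_files=DB_FILES):
    """Map an absolute file number from an ORA- error to a datafile kind."""
    if file_no > db_files:
        return "tempfile #%d (check PSAPTEMP and its filesystem)" % (file_no - db_files)
    return "permanent datafile #%d" % file_no

print(classify_file(256))  # the file from the ORA-01114 message
```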
Thanks to everyone who showed interest in solving my problem. -
Dear all,
Can anyone please tell me how to view the archived data in the same format we see it in the data target?
I used an ODS as the data target; after archiving I want to see the same data format we see in the ODS active data.
Please tell me how the data is saved during archiving, what format it is saved in, and how to view it from the archived files.
Thanks in advance.
Gomango
Hi
Use the following tables to suite your objectives
RSARCHIPROLOCSEL BW Archiving: Archived Data Area
RSARCHIPRO BW Archiving: General Archiving Properties
RSARCHIPROIOBJ BW Archiving: General Archiving Properties
RSARCHIPROLOC BW ARchiving: General Local Properties
RSARCHIPROPID BW Archiving: Program References of InfoProvider
RSARCHREQ BW Archiving: Archiving Request
RSARCHREQFILES BW Archiving: Verified Archive Files
RSARCHREQSEL BW Archiving: Request-Selections
Santosh -
BW on HANA, Archive data to Hadoop
Dear All,
We are planning to start a poc for one of our clients. Below is the scenario.
use BW on HANA for real time analytics and Hadoop as a cold storage.
Archive historical data to HAdoop
report on HANA and Hadoop.
Access Hadoop data using SDA
I request you to provide implementation steps if somebody has worked on a similar scenario.
Thanks & Regards,
Rajeev Bikkani
Hi Rajeev Bikkani,
Currently NLS using Hadoop is not available by default, and SAP highly recommends IQ for NLS. If you opt for IQ, in the longer run it will be easier to maintain and scale up, and it gives better query performance, so it will yield a better ROI. Hadoop will be cost effective initially, but the amount of time spent getting the solution working will be challenging, and maintaining and scaling it up later will be very challenging too. So SAP highly recommends IQ for NLS. SAP positions Hadoop to handle big data along with HANA, mainly for unstructured data, not for NLS. So please reconsider your option.
I went through the link, and I don't agree with the point "Archiving was not an option, as they needed to report on all data for trending analysis. NLS required updating the info providers and bex queries to access NLS data. SAP offers Sybase IQ as a NLS option but it doesn't come cheap with a typical cost well over a quarter million dollars."
That is because you can query the archived data, and it doesn't need to be written back to the providers; at runtime the data is picked from NLS and displayed.
If you still want to use Hadoop as NLS, I can suggest a process, but I have not tried it personally.
1) Extract data selectively from your InfoProvider via OHD and keep it in the OHD table.
2) Write the data from the OHD table to an HBase table (check the link below for how to do it).
3) Delete the data from the OHD table.
4) Whatever data was moved to the OHD table should be deleted from the InfoProvider by selective deletion.
5) Now connect HANA to Hadoop via SDA and virtualise the table to which we have written the data.
6) Then build a view on top of this table and query on it.
7) HANA-view historical data can be combined with BW provider data via an Open ODS view, CompositeProvider or Transient Provider.
Reading and Writing to HADOOP HBASE with HANA XS
http://scn.sap.com/community/developer-center/hana/blog/2014/06/03/reading-and-writing-to-hadoop-hbase-with-hana-xs
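Step 2, shaping OHD rows into HBase puts, might look roughly like this (a sketch only; the key fields and the 'd:' column family are invented for illustration, and the actual write would go through an HBase client against the table from the blog above):

```python
# Sketch of step 2: turn flat OHD (open hub) records into
# (row_key, {column: value}) pairs ready for HBase 'put' calls.
# Key fields and the 'd:' column family are illustrative only.

def build_hbase_puts(ohd_rows, key_fields):
    """Concatenate the semantic key into a row key; the rest become columns."""
    puts = []
    for rec in ohd_rows:
        row_key = "|".join(str(rec[f]) for f in key_fields)
        cols = {"d:" + f: str(v) for f, v in rec.items() if f not in key_fields}
        puts.append((row_key, cols))
    return puts

rows = [{"MATERIAL": "00DU023R", "CALWEEK": "200801", "STATUS": "A"}]
print(build_hbase_puts(rows, ["MATERIAL", "CALWEEK"]))
```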
Hope this helps..
Thanks & Regards
A.Dinesh -
Hey Folks,
I load data into an ODS weekly. The data contains status information for a product.
The source system delivers only records where the status has changed.
If the status of a product is constant over two or more weeks, I get no record for that product in the current week, but I still want to display its status for the current week.
So I need to insert records in my ODS for all products for which I don't get a record from the source system in the current week.
I was told that I need a start routine and have to implement my logic in ABAP.
Are there any other ways, or can somebody explain to me how to do this in ABAP?
Thanks for your help!
Stephan
Hi Stephan,
Please try this logic.
Load the transaction data into the ODS weekly as per your source system data.
Now you need to update it based on the constant status. You can write an ABAP program with the logic below.
Create internal tables for the master data and the ODS and read all the contents into them.
Get the details from the master data, manually update the status field (no data from the source system), and store the result in an internal table (same structure as the active records table /BIC/AZ***) holding the records to be appended to the ODS.
Append it to the active records table of the ODS.
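The steps above can be sketched in Python to show the carry-forward idea (field names are made up for the example; in the real system this would be ABAP appending to the /BIC/AZ*** table):

```python
# Illustrative carry-forward: for every product in master data that sent
# no record this week, append a row repeating its last known status.
# Field names are invented for the example.

def fill_missing_status(master_products, last_status, week_records, week):
    """Return week_records plus carried-forward rows for silent products."""
    loaded = {r["product"] for r in week_records}
    out = list(week_records)
    for prod in master_products:
        if prod not in loaded and prod in last_status:
            out.append({"product": prod, "week": week,
                        "status": last_status[prod]})  # carried forward
    return out

records = [{"product": "A", "week": "2008W02", "status": "NEW"}]
print(fill_missing_status(["A", "B"], {"B": "SHIPPED"}, records, "2008W02"))
```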
Hope this helps.
Thanks,
Srinivas. -
Hi Gurus,
I'm loading data from one ODS to another ODS through a full update, and after that I have to activate that ODS.
Kindly let me know the steps for activating the ODS:
right-click on the ODS and choose "Activate data in ODS",
or do I first have to do "Update ODS data in data target"?
Raj
1st we have to go to the ODS in the data targets and right-click on the ODS.
In the options, click on "Generate Export DataSource". Then go to the source system to identify the DataSource.
It will start with 8 followed by whatever name you had given the ODS; that is what you have to enter in the search.
Later you have to create an InfoPackage for that DataSource and load data with that IP.
Next, come to the data targets and create the new ODS.
Create the update rules with the DataSource you have just created.
Later you have to create an InfoPackage for that ODS and load the data with that IP.
Hope it helps you.......... -
Hi
I have got one ODS, for which my InfoPackage is designed as "do not update data if no master data exists for a characteristic".
But one of my requests, which has only entries with master data, is not getting activated in the ODS. It fails with the error "No SID found for value '00DU023R ' of characteristic 0MATERIAL", but the material 00DU023R is not in the request at all.
Has anyone faced this scenario before? How can I activate the data in the ODS?
Regards
Sriram
Hi Sriram,
Here is what SSree was trying to say.
Go to your ODS, right-click and choose Manage.
In the Requests tab, choose a very old date and refresh. Now check whether you see any red requests; these will prevent activation.
On the other hand, when I see your first error, I would go to transaction RSRV, and do some consistency checks on your ODS and master data.
kr,
Tom -
Hi Friends,
I don't see any data in my ODS. My load went fine; then I went to my ODS, right-click > Activate data in ODS. What could the problem be?
Thanks in advance.
ODS > Context Menu > Manage
Go to the Contents tab and check the tables (active, new). The data should be in one of them.
Is the activation over? -
Hi Kudos!
Please let me know the methods of deleting the data in an ODS from all the tables, before and after activation. Let me know if there are any blogs about this.
Thanks in advance.
Kalyan Pidugu.
Hi Kalyan,
As Malli mentioned, those are the ways you can delete the data... again, it depends on the update mode used for the data loads. If it is a delta load from R/3, make sure you mark the request as red before deleting, so you don't lose data from subsequent deltas.
If it is a full update and you wish to delete it, you can simply delete it, because BW will mark it as red.
What is it you want to know about activation?
Assign points if useful
Arun -
Hello,
I am trying to load data from R/3 into the ODS.
When I check the monitor it says
07:37:12 ( 549741 from 549741 records )
but it is not turning green; it is still yellow.
I checked the new data in the ODS. The number of entries in it is 0.
Hi
You can view the data in the ODS only after the status is green.
Check whether it is showing any error in the description.
If not, then wait for some time; we can't do anything while it is showing a yellow (in process) status.
Cheers
SM -
Hi Experts,
Can anybody tell me how data is kept in the active data table, the new data table and the change log table? And along with these there is a Contents button that asks for a table name; what does that button show? Can anybody help me out?
Regards,
Vikram
Hi Vikram,
If you right-click the ODS --> go to Manage --> Contents tab,
you will find three tables:
1. New table: when you load data it initially goes into the new table, and if you click on it you can see the content.
2. Active and change log: when you activate data in the ODS, the data moves from the new table into the active and change log tables and is used for further processing into other data targets such as a cube or another ODS.
The change log table records all the changes that happened to the data.
Check this blog for more info
/people/raj.alluri/blog/2006/11/24/the-tech-details-of-standard-ods-dso-in-sap-dwh
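As a rough illustration of those table mechanics, here is a toy model (a sketch only; real ODS activation also handles requests, SIDs and before-images):

```python
# Toy model of ODS activation: rows move from the New table into the
# Active table, and every change is appended to the Change Log as an
# (old image, new image) pair. Real activation is far more involved.

def activate(new_table, active_table, change_log):
    for key in list(new_table):
        row = new_table.pop(key)             # New table is emptied
        old = active_table.get(key)
        if old != row:
            change_log.append((key, old, row))
        active_table[key] = row              # newest image wins
    return active_table

new, active, log = {"00DU023R": {"STATUS": "A"}}, {}, []
activate(new, active, log)
print(active, new, log)
```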
Hope this helps,
Sudhakar. -
Data in ODS, InfoCube and MultiProvider (LISTCUBE) are in sync
Hi,
My query is built on a MultiProvider. The data flow is DataSource → ODS, then ODS → InfoCube, and the MultiProvider contains the InfoCube only.
The data in the ODS, InfoCube and MultiProvider (LISTCUBE) are in sync.
The query results do not tie up with the ODS, InfoCube and MultiProvider (LISTCUBE).
Can anyone let me know why this is happening and how I can resolve it?
Regards,
Sharma.
Hi;
Thanks for the help.
I resolved the issue on my own.
Regards,
Sharma. -
Errors when loading data to ODS
Hi,
I am getting the following dump when loading data to an ODS.
What might be the problem?
Runtime Error MESSAGE_TYPE_X
Date and Time 29.09.2006 14:26:52
ShrtText
The current application triggered a termination with a short dump.
What happened?
The current application program detected a situation which really
should not occur. Therefore, a termination with a short dump was
triggered on purpose by the key word MESSAGE (type X).
What can you do?
Print out the error message (using the "Print" function)
and make a note of the actions and input that caused the
error.
To resolve the problem, contact your SAP system administrator.
You can use transaction ST22 (ABAP Dump Analysis) to view and administer
termination messages, especially those beyond their normal deletion
date. This is especially useful if you want to keep a particular message.
Error analysis
Short text of error message:
Test message: SDOK_GET_PHIO_ACCESS 001
Technical information about the message:
Message class....... "1R"
Number.............. 000
Variable 1.......... "SDOK_GET_PHIO_ACCESS"
Variable 2.......... 001
Variable 3.......... " "
Variable 4.......... " "
How to correct the error
Probably the only way to eliminate the error is to correct the program.
You may be able to find an interim solution to the problem
in the SAP note system. If you have access to the note system yourself,
use the following search criteria:
"MESSAGE_TYPE_X" C
"SAPLSDCL" or "LSDCLF00"
"INTERNAL_ERROR"
If you cannot solve the problem yourself and you wish to send
an error message to SAP, include the following documents:
1. A printout of the problem description (short dump)
To obtain this, select in the current display "System->List->
Save->Local File (unconverted)".
2. A suitable printout of the system log
To obtain this, call the system log through transaction SM21.
Limit the time interval to 10 minutes before and 5 minutes
after the short dump. In the display, then select the function
"System->List->Save->Local File (unconverted)".
3. If the programs are your own programs or modified SAP programs,
supply the source code.
To do this, select the Editor function "Further Utilities->
Upload/Download->Download".
4. Details regarding the conditions under which the error occurred
or which actions and input led to the error.
System environment
SAP Release.............. "640"
Application server....... "bomw093a"
Network address.......... "132.186.125.66"
Operating system......... "Windows NT"
Release.................. "5.2"
Hardware type............ "4x Intel 801586"
Character length......... 8 Bits
Pointer length........... 32 Bits
Work process number...... 16
Short dump setting....... "full"
Database server.......... "BOMW093A"
Database type............ "ORACLE"
Database name............ "BIW"
Database owner........... "SAPDAT"
Character set............ "English_United State"
SAP kernel............... "640"
Created on............... "Nov 4 2004 23:26:03"
Created in............... "NT 5.0 2195 Service Pack 4 x86 MS VC++ 13.10"
Database version......... "OCI_920_SHARE "
Patch level.............. "43"
Patch text............... " "
Supported environment....
Database................. "ORACLE 8.1.7.., ORACLE 9.2.0.."
SAP database version..... "640"
Operating system......... "Windows NT 5.0, Windows NT 5.1, Windows NT 5.2"
Memory usage.............
Roll..................... 8112
EM....................... 6271776
Heap..................... 0
Page..................... 24576
MM Used.................. 3921120
MM Free.................. 258392
SAP Release.............. "640"
User and Transaction
Client.............. 800
User................ "IC881147"
Language key........ "E"
Transaction......... " "
Program............. "SAPLSDCL"
Screen.............. "SAPMSSY0 1000"
Screen line......... 6
Information on where terminated
The termination occurred in the ABAP program "SAPLSDCL" in "INTERNAL_ERROR".
The main program was "RSRD_BROADCAST_PROCESSOR ".
The termination occurred in line 25 of the source code of the (Include)
program "LSDCLF00"
of the source code of program "LSDCLF00" (when calling the editor 250).
The program "SAPLSDCL" was started as a background job.
Job name........ "SECOQUERY"
Job initiator... "IC881147"
Job number...... 14265102
Source Code Extract (Include LSDCLF00, termination at line 25)

*----------------------------------------------------------------------*
*   INCLUDE LSDCLF00                                                    *
*----------------------------------------------------------------------*

*----------------------------------------------------------------------*
*       FORM INTERNAL_ERROR                                             *
*----------------------------------------------------------------------*
*       Handles unexpected error conditions (internal errors)
*  -->  VALUE(U_ROUTINE)     Routine/function module where error occured
*  -->  VALUE(U_ERROR_CODE)  Identifier in routine (e.g. number)
*  -->  VALUE(U_VAR1)        Variable containing further information
*  -->  VALUE(U_VAR2)        Variable containing further information
*  -->  VALUE(U_VAR3)        Variable containing further information
*  -->  VALUE(U_VAR4)        Variable containing further information
form internal_error
     using value(u_routine)
           value(u_error_code)
           value(u_var1)
           value(u_var2)
           value(u_var3)
           value(u_var4).
>>>>>  message x000 with u_routine u_error_code u_var1 u_var2.   " line 25
endform.

*&---------------------------------------------------------------------*
*&      Form  BAD_OBJECT_TO_SYMSG
*&---------------------------------------------------------------------*
*       maps error information in u_bad_object into system message
*       variables
*  -->  VALUE(U_BAD_OBJECT)  structure containing error information
form bad_object_to_symsg
     using value(u_bad_object) type sdokerrmsg.
  sy-msgid = u_bad_object-id.
  sy-msgty = u_bad_object-type.
  sy-msgno = u_bad_object-no.
  sy-msgv1 = u_bad_object-v1.
  sy-msgv2 = u_bad_object-v2.
Contents of system fields
Name       Val.
SY-SUBRC   0
SY-INDEX   0
SY-TABIX   1
SY-DBCNT   4
SY-FDPOS   0
SY-LSIND   0
SY-PAGNO   0
SY-LINNO   1
SY-COLNO   1
SY-PFKEY
SY-UCOMM
SY-TITLE   Report Dissemination Framework: Executing the Transferred Settings
SY-MSGTY   X
SY-MSGID   1R
SY-MSGNO   000
SY-MSGV1   SDOK_GET_PHIO_ACCESS
SY-MSGV2   001
SY-MSGV3
SY-MSGV4
Active Calls/Events
No. Ty.      Program                           Include                              Line  Name
 15 FORM     SAPLSDCL                          LSDCLF00                               25  INTERNAL_ERROR
 14 FORM     SAPLSDCI                          LSDCIU13                              303  PHIO_GET_CONTENT_ACCESS
 13 FUNCTION SAPLSDCI                          LSDCIU13                              113  SDOK_PHIO_GET_CONTENT_ACCESS
 12 FUNCTION SAPLSKWF_CONTENT                  LSKWF_CONTENTU02                       63  SKWF_PHIO_CONTENT_ACCESS_GET
 11 METHOD   CL_RSRA_KWF_UTILITIES=========CP  CL_RSRA_KWF_UTILITIES=========CM00B    50  CL_RSRA_KWF_UTILITIES=>COPY_MIME_TO_FOLDER
 10 METHOD   CL_RSRA_KWF_TMPL==============CP  CL_RSRA_KWF_TMPL==============CM002    28  CL_RSRA_KWF_TMPL=>GET_STYLESHEET
  9 METHOD   CL_RSRA_KWF_TMPL==============CP  CL_RSRA_KWF_TMPL==============CM001   227  CL_RSRA_KWF_TMPL=>CONSTRUCTOR
  8 METHOD   CL_RSRA_ENGINE_BC=============CP  CL_RSRA_ENGINE_BC=============CM010     9  CL_RSRA_ENGINE_BC=>SET_TEMPLATE_FOLDER
  7 METHOD   CL_RSRA_ENGINE_BC=============CP  CL_RSRA_ENGINE_BC=============CM001    75  CL_RSRA_ENGINE_BC=>CONSTRUCTOR
  6 METHOD   CL_RSRA_JOB===================CP  CL_RSRA_JOB===================CM003    47  CL_RSRA_JOB=>EXECUTE_SINGLE
  5 METHOD   CL_RSRA_JOB===================CP  CL_RSRA_JOB===================CM00E    14  CL_RSRA_JOB=>EXECUTE_SINGLE_RC
  4 METHOD   CL_RSRD_PRODUCER_RA===========CP  CL_RSRD_PRODUCER_RA===========CM001   147  CL_RSRD_PRODUCER_RA=>IF_RSRD_F_PRODUCER_RT~PRODUCE
  3 METHOD   CL_RSRD_SETTING===============CP  CL_RSRD_SETTING===============CM005    28  CL_RSRD_SETTING=>EXECUTE_NODES
  2 METHOD   CL_RSRD_SETTING===============CP  CL_RSRD_SETTING===============CM002    73  CL_RSRD_SETTING=>EXECUTE
  1 EVENT    RSRD_BROADCAST_PROCESSOR          RSRD_BROADCAST_PROCESSOR              197  START-OF-SELECTION
Chosen variables (key values)

No. 15  Ty. FORM  Name INTERNAL_ERROR
  U_ROUTINE ........... SDOK_GET_PHIO_ACCESS
  U_ERROR_CODE ........ 001
  U_VAR1 / U_VAR2 ..... (blank)
  SY-MSGV1 ............ SDOK_GET_PHIO_ACCESS
  SY-MSGV2 ............ 001
  SY-MSGV3 / SY-MSGV4 . (blank)

No. 14  Ty. FORM  Name PHIO_GET_CONTENT_ACCESS
  DUMMY_VERSTYPE ...... 0
  BUFF_XPIRE .......... 000000000000
  SUBRC_AUX ........... 3
  SY-XFORM ............ CONVERSION_EXIT
  CONTEXT[] ........... Table IT_5544[0x89]
  PROPERTIES[] ........ Table[initial]

No. 13  Ty. FUNCTION  Name SDOK_PHIO_GET_CONTENT_ACCESS
  CLIENT .............. 800
  CONTENT_ONLY ........ X
  TEXT_AS_STREAM ...... X
  ACCESS_MODE ......... 00
  COMPONENT_ACCESS[] .. Table IT_5533[0x8616]
  FILE_CONTENT_ASCII[]  Table IT_5534[0x1022]
  FILE_CONTENT_BINARY[] Table IT_5535[0x1022]
  COMPONENTS[], PROPERTIES[] ... Table[initial]
  SY-REPID ............ SAPLSDCI

No. 12  Ty. FUNCTION  Name SKWF_PHIO_CONTENT_ACCESS_GET
  X_CONTENT_ONLY ...... X
  X_TEXT_AS_STREAM .... X
  ACCESS_MODE ......... 00
  ERROR ............... 000
  SKWFA_C_ACT_READ .... 03
  SYST-REPID .......... SAPLSKWF_CONTENT
PHIO+1(42)
222222222222222222222222222222222222222222
000000000000000000000000000000000000000000
CL_ABAP_TABLEDESCR=>TABLEKIND_STD
S
5
3
SKWFC_YES
X
5
8
No. 11 Ty. METHOD
Name CL_RSRA_KWF_UTILITIES=>COPY_MIME_TO_FOLDER
I_S_MIME_IO
FM_FOLDER 3AA00E1E0D0E3DCBE10000000A1144B5
4454444452234433434343434444333333334333343
6DF6FC4520031100515040534325100000001114425
I_S_FOLDER_IO
FBW_FLD 08BSWBLVV6N1IJCIG3VHEX2H8
4455444222233455445534344444354453432222222
627F6C40000082372C666E19A397368582880000000
L_T_PROPERTY_REQUEST
Table IT_5512[1x25]
CLASS=CL_RSRA_KWF_UTILITIESMETHOD=COPY_MIME_TO_FOLDERDATA=L_T_PROPERTY_REQUEST
Table reference: 316
TABH+ 0(20) = 28D5D23C68D5D23C000000003C01000088150000
TABH+ 20(20) = 0100000019000000FFFFFFFF04C5010038110000
TABH+ 40( 8) = 10000000C1248400
store = 0x28D5D23C
ext1 = 0x68D5D23C
shmId = 0 (0x00000000)
id = 316 (0x3C010000)
label = 5512 (0x88150000)
fill = 1 (0x01000000)
leng = 25 (0x19000000)
loop = -1 (0xFFFFFFFF)
xtyp = TYPE#000069
occu = 16 (0x10000000)
access = 1 (ItAccessStandard)
idxKind = 0 (ItIndexNone)
uniKind = 2 (ItUniqueNon)
keyKind = 1 (default)
cmpMode = 2 (cmpSingleMcmpR)
occu0 = 1
collHash = 0
groupCntl = 0
rfc = 0
unShareable = 0
mightBeShared = 1
sharedWithShmTab = 0
isShmLockId = 0
gcKind = 0
isUsed = 1
>>>>> Shareable Table Header Data <<<<<
tabi = 0xC00BE03C
pghook = 0x00000000
idxPtr = 0x00000000
refCount = 0 (0x00000000)
tstRefCount = 0 (0x00000000)
lineAdmin = 16 (0x10000000)
lineAlloc = 16 (0x10000000)
store_id = 3644 (0x3C0E0000)
shmIsReadOnly = 0 (0x00000000)
>>>>> 1st level extension part <<<<<
regHook = 0x00000000
hsdir = 0x00000000
ext2 = 0x600DE03C
>>>>> 2nd level extension part <<<<<
tabhBack = 0x98D5033D
delta_head = 000000000000000000000000000000000000000000000000000000000000000000000000
pb_func = 0x00000000
pb_handle = 0x00000000
L_S_PROPERTY_REQUEST
KW_RELATIVE_URL
4555444545455542222222222
B7F25C14965F52C0000000000
L_T_PHIO
Table IT_5494[1x43]
CLASS=CL_RSRA_KWF_UTILITIESMETHOD=COPY_MIME_TO_FOLDERDATA=L_T_PHIO
Table reference: 309
TABH+ 0(20) = E8D4D23CE0D3D23C000000003501000076150000
TABH+ 20(20) = 010000002B000000FFFFFFFF04C5010018120000
TABH+ 40( 8) = 10000000C1248000
store = 0xE8D4D23C
ext1 = 0xE0D3D23C
shmId = 0 (0x00000000)
id = 309 (0x35010000)
label = 5494 (0x76150000)
fill = 1 (0x01000000)
leng = 43 (0x2B000000)
loop = -1 (0xFFFFFFFF)
xtyp = TYPE#000073
occu = 16 (0x10000000)
access = 1 (ItAccessStandard)
idxKind = 0 (ItIndexNone)
uniKind = 2 (ItUniqueNon)
keyKind = 1 (default)
cmpMode = 2 (cmpSingleMcmpR)
occu0 = 1
collHash = 0
groupCntl = 0
rfc = 0
unShareable = 0
mightBeShared = 0
sharedWithShmTab = 0
isShmLockId = 0
gcKind = 0
isUsed = 1
>>>>> Shareable Table Header Data <<<<<
tabi = 0x2010E03C
pghook = 0x00000000
idxPtr = 0x00000000
refCount = 0 (0x00000000)
tstRefCount = 0 (0x00000000)
lineAdmin = 16 (0x10000000)
lineAlloc = 16 (0x10000000)
store_id = 3643 (0x3B0E0000)
shmIsReadOnly = 0 (0x00000000)
Hi priya,
Not sure: check the syntax in your Update Rules, also at the level of the start routine.
Ciao.
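For reference, this is roughly what a BW 3.x start routine skeleton looks like (structure and field names here are illustrative, not taken from your system); a slip in this block, such as a misspelled field of DATA_PACKAGE or an unbalanced LOOP/ENDLOOP, only surfaces when the update rules are regenerated and activated:

```abap
* Minimal BW 3.x start routine skeleton - illustrative only.
* /BIC/CS8ZODS1 and /BIC/ZDOCNR are hypothetical names.
FORM startroutine
  TABLES   DATA_PACKAGE STRUCTURE /BIC/CS8ZODS1
  USING    RECORD_ALL LIKE SY-TABIX
  CHANGING ABORT LIKE SY-SUBRC.

* Example filter: drop records with an empty document number
  DELETE DATA_PACKAGE WHERE /BIC/ZDOCNR IS INITIAL.

* ABORT <> 0 cancels processing of the whole data package
  ABORT = 0.
ENDFORM.
```

Reactivating the update rules after any change here forces the generated program to be recompiled, which is often enough to flush a stale load that dumps during activation.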
Riccardo. -
URGENT!!!! - Reading Archived Data from a Report.
Hi,
Data older than 15 months in some tables (BSEG, BSAK) used in the report YGF11347 has been archived.
Previously, report YGF11347 was run for vendor payment information for any time frame
through FBL1N, so for vendor line items from 2004, 2005, 2006 and the current year,
all data elements were available to be extracted and displayed in the report.
Recently, the Check Amount has been missing when executing the report for payments made prior to
04/01/2006. This is because the data older than 15 months has been archived.
My requirement is that I also need to display this archived data in the report YGF11347.
Can anyone please help me with this issue?
Regards,
Akrati
Hi,
good, check this link; hope it helps you solve your problem.
http://www.ams.utoronto.ca/Assets/output/assets/ixos_637070.pdf.pdf
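Besides the document linked above, archived FI data can also be read programmatically in the report itself through the Archive Development Kit (ADK). A rough sketch follows; the function module names are the standard ADK ones, but the exact parameter lists should be verified in SE37 for your release, so treat this as an outline rather than working code:

```abap
* Hedged ADK sketch: reading archived BSEG records for FI_DOCUMNT.
* Verify parameter names in SE37 before use.
DATA: l_handle TYPE i,
      lt_bseg  TYPE STANDARD TABLE OF bseg.

CALL FUNCTION 'ARCHIVE_OPEN_FOR_READ'
  EXPORTING
    object         = 'FI_DOCUMNT'
  IMPORTING
    archive_handle = l_handle.

DO.
* Position on the next archived document; stop at end of file
  CALL FUNCTION 'ARCHIVE_GET_NEXT_OBJECT'
    EXPORTING
      archive_handle = l_handle
    EXCEPTIONS
      end_of_file    = 1.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.

* Pull the BSEG lines of the current document into lt_bseg
  CALL FUNCTION 'ARCHIVE_GET_TABLE'
    EXPORTING
      archive_handle        = l_handle
      record_structure      = 'BSEG'
      all_records_of_object = 'X'
    TABLES
      table                 = lt_bseg.

* ... merge lt_bseg into the report's output table here ...
ENDDO.

CALL FUNCTION 'ARCHIVE_CLOSE_FILE'
  EXPORTING
    archive_handle = l_handle.
```

The archived records would then be appended to the data the report already reads from the live BSEG/BSAK tables, so the missing Check Amounts reappear for periods before the archiving cutoff.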
thanks
mrutyun^ -
Changing the default location of Archiver data
Hi,
Is there any way to change the default location of the Archiver data? By default it goes to the archives folder; I want it to go to some other folder.
Similarly, when you take a backup of the site, I want it to be stored in some other folder as well.
Is it possible?
For the Archiver, see My Oracle Support Note:
How can I move the archive directory? (Doc ID 449054.1)
Not sure what you mean by a backup. There is no built-in disaster recovery tool for UCM. Standard system backups (file system and DB) should be used for restores.
Maybe you are looking for
-
Notifiers: have it run more than once.
Hi, I just started learning about notifiers and occurrences by reading through the forums and looking at the LabVIEW examples. I need the functionality of notifiers/occurrences in my VI and I chose to use notifiers. I will give a slight des
-
How to transfer video files from Mac Mini to iPad
What is the best method to copy a video from a Mac to the iPad? Please list the steps necessary.
-
Some iPhone Calendar Events missing
Have had MobileMe (formerly Mac.com) for several years. Since last fall (2009) I have tried to trust the 'Cloud' for syncing Contacts and Calendars over the air. It worked for the most part. Lately I have seen missing items or events on the Calendar. Notice today that
-
Dreamweaver 8 Update 8.0.2
Trying to see if anyone can tell me where to get a trial version of Dreamweaver 8 (I know, it's archaic), but I called Customer Support and they told me to use the Forums to send an email. Not sure this will help much, but I am trying to fix Dreamweav
-
MacBook won't read my hard drive?
I have a 1TB Seagate hard drive and have been using it for my backups. I installed Lion on all my laptops and computers and the hard drive worked fine afterwards. Suddenly the hard drive can no longer be read by all my desktops, and especially my laptop. Is it the har