Regarding the Status of Loaded Data in Report.
Hi All,
My client wants to see the status of the data represented in the reports. For example, R/3 has data up to today, but we have only loaded data up to yesterday; he wants that reflected in the report, e.g. "Data as on May 8th". Is there any script that can be written to get this information dynamically in the report header?
Hi YJ,
On the web you can use the web item 'Text Elements', choose element type 'General Text' and element ID ROLLUPTIME.
<object>
<param name="OWNER" value="SAP_BW">
<param name="CMD" value="GET_ITEM">
<param name="NAME" value="TextElements_1">
<param name="ITEM_CLASS" value="CL_RSR_WWW_ITEM_TEXT_ELEMENTS">
<param name="DATA_PROVIDER" value="Data_Provider">
<param name="GENERATE_CAPTION" value="">
<param name="GENERATE_LINKS" value="">
<param name="SHOW_FILTERS" value="">
<param name="SHOW_VARIABLES" value="">
<param name="ELEMENT_TYPE_1" value="COMMON">
<param name="ELEMENT_NAME_1" value="ROLLUPTIME">
<param name="ONLY_VALUES" value="X">
ITEM: TextElements_1
</object>
In BEx Analyzer, you can display the date via the menu Business Explorer -> Display Text Elements -> General; you will see some info such as 'Author' and 'Last Refreshed'.
Hope this helps.
Similar Messages
-
Automatically trigger the event to load data from Planning cube to Standard Cube
Hello,
We have the below setup in our system:
1. A Planning BEx query using which user makes certain entries and writes back data to the planning cube.
2. An actual reporting cube which gets data from the planning cube above.
Now, what we want to do is to automate the data load from Planning cube to Reporting cube.
This involves 2 things..
1. Change the setting "Change real time load behaviour" of the planning cube to Planning.
2. Trigger the DTP which loads data from Planning cube to reporting cube.
We want to automate the above two steps...
I have tried a few things to achieve this:
1. Created an event in SM64.
2. In the planning cube "Manage" screen, clicked on "Subsequent Processing" and provided the event details (not sure if that is the correct place to provide the event details).
3. Wrote an ABAP program which changes the setting of the planning cube ("Change real time load behaviour" to Loading).
4. Created a process chain with the event as the start variant, the ABAP program as the next step, and the DTP run as the last step.
This, I hoped, would trigger the event as soon as a new request comes into the planning cube, which in turn would trigger the process chain that loads the data from the planning cube to the reporting cube.
This is not working. I don't think the event is triggering, and even if it does, I am not sure it will start the process chain automatically. Any ideas, please?
Hi,
Try doing the transformation directly in the input cube by using a characteristic relationship (CR) of type exit; more details:
http://help.sap.com/saphelp_nw70ehp2/helpdata/en/43/1c3d0f31b70701e10000000a422035/content.htm
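As a side note, if you stay with the event-based approach, the event can also be raised from a small ABAP program (e.g. scheduled via "Subsequent Processing" or at the end of the write-back). A minimal sketch, assuming a custom event ZBW_PLAN_LOADED created in SM62 -- the event name is a placeholder:

```abap
REPORT zraise_plan_event.

* Raise a background event so that a process chain whose start variant
* waits on this event gets triggered. ZBW_PLAN_LOADED is an assumed
* custom event; create it in SM62 first.
CALL FUNCTION 'BP_EVENT_RAISE'
  EXPORTING
    eventid                = 'ZBW_PLAN_LOADED'
  EXCEPTIONS
    bad_eventid            = 1
    eventid_does_not_exist = 2
    eventid_missing        = 3
    raise_failed           = 4
    OTHERS                 = 5.

IF sy-subrc <> 0.
  WRITE: / 'Event could not be raised, sy-subrc =', sy-subrc.
ENDIF.
```

Remember that the process chain's start variant must be scheduled to wait for the same event ID.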
hope it helps. -
I get "unable to load data class information" when I try to sync
I get the "unable to load data class information" error when I try to sync with my iPad. I tried uninstalling and resetting the sync history; it still doesn't work. Help.
Hey Mike Urwin
Start with the article below for troubleshooting syncing issues with Sync Services.
Advanced troubleshooting for Sync Services on Windows with Microsoft Outlook 2003, Outlook 2007, or Outlook 2010
http://support.apple.com/kb/TS2776
Regards,
-Norm G. -
Regarding the status of the report
Hi, can anybody tell me the way to get the status of various sales orders? It should check some seven conditions and display the status. The following are the statuses to be displayed:
a. Mark as O (Open) = Get VBUK-LFSTK for VBAK-VBELN. If VBUK-LFSTK is A, show the sales order status as open.
b. Mark as H (On Hold) = Pull VBAK-VBELN with reference to VBAK-LIFSK, if the sales order has the delivery block (15 IBM Pending BOP file).
c. Mark as D (Drop/Pull) = Get LIKP-VBELN with reference to VBAK-VBELN and give LIKP-VBELN for VBUK-VBELN; if VBUK-LVSTK is A, then it is dropped.
d. Mark as N (Pending) = Get LIKP-LIFSK from LIKP-VBELN for VBAK-VBELN, if the status is 12, 13, 14, 15, 16 or 17.
e. Mark as F (Floor) = Get LIKP-VBELN with reference to VBAK-VBELN and give LIKP-VBELN for VBUK-VBELN; if VBUK-LVSTK is C, then it is transferred to the floor.
f. Mark as K (Kitted) = Get LIKP-VBELN with reference to VBAK-VBELN and give LIKP-VBELN for VBUK-VBELN; if VBUK-LVSTK is B, then it is kitted.
g. Mark as P (Packed) = Get LIKP-VBELN with reference to VBAK-VBELN and give LIKP-VBELN as an input for VBUK-PKSTK; it is packed if PKSTK is C.
h. Mark as G (PGI) = Get LIKP-VBELN with reference to VBAK-VBELN and give LIKP-VBELN as an input for VBUK-WBSTK; it is PGI'd if WBSTK is C.
i. Mark as E (EDI Error) in the following two scenarios = If VBUK-LFGSK is not equal to C for the sales order, check VBAK-LIFSK; it is an EDI error if the value is 14 or 17. On the other hand, if VBUK-LFGSK is equal to C for the sales order, check the delivery header LIKP-LIFSK; it is an EDI error if the value is 14 or 17.
-
How to add a new field in the cube and load data
Hi,
The requirement is
We have a ZLOGISTICS cube; the DataSource of this cube has the field REFDCONR (reference document number). We have to create a new field in the cube, load data, and get this new field into the report as well.
Can anyone please help me with the step-by-step process of how to do this?
How do we get the data into BW and into the report?
Hi,
So you need this new field to have data in old records as well?
1. If you are on BI 7.0 and the logic or data for the new field is in the same dimension, you can use remodeling to fill it, i.e. if you want to load it from the master data of another InfoObject in the same dimension.
2. If condition 1 does not apply to you:
First add the new field, then create a backup cube (both cubes with the new field) and make a full update with all information from the original cube. The new field will be empty in both cubes.
Then create update rules (UR) from BackUp_Cube to Original_Cube with direct mapping for all fields, and add logic in the start routine of the UR (modifying the data_package) to look up the data in the DSO that you normally use to load.
To do that, both cubes have to be DataSources (right-click on the cube -> Additional Functions -> and I think it is "Extract DataSource").
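A minimal sketch of such a start routine (3.x update rules), with everything hedged: the DSO active table /BIC/AZMYDSO00, the key field DOC_NUMBER and the new field ZREFDOC are placeholders for your own objects:

```abap
* Look up the new field from the active table of an assumed DSO and
* fill it for every row of the data package. All names are placeholders.
TYPES: BEGIN OF ty_lookup,
         doc_number TYPE c LENGTH 10,
         refdocnr   TYPE c LENGTH 10,
       END OF ty_lookup.

DATA: lt_lookup TYPE STANDARD TABLE OF ty_lookup,
      ls_lookup TYPE ty_lookup.

FIELD-SYMBOLS: <fs_dp> LIKE LINE OF data_package.

* Read only the documents contained in this data package
IF data_package[] IS NOT INITIAL.
  SELECT doc_number refdocnr
    FROM /bic/azmydso00
    INTO TABLE lt_lookup
    FOR ALL ENTRIES IN data_package
    WHERE doc_number = data_package-doc_number.
  SORT lt_lookup BY doc_number.
ENDIF.

LOOP AT data_package ASSIGNING <fs_dp>.
  READ TABLE lt_lookup INTO ls_lookup
       WITH KEY doc_number = <fs_dp>-doc_number
       BINARY SEARCH.
  IF sy-subrc = 0.
    <fs_dp>-zrefdoc = ls_lookup-refdocnr.
  ENDIF.
ENDLOOP.
```

Reading only the keys present in the package (FOR ALL ENTRIES) keeps the lookup table small, which matters on large DSOs.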
Hope it helps. Regards, Federico -
What is the difference between loading data from SBIW and LO (LBWE)?
Hi all experts,
I am wondering what I need to do to load data to generate reports (InfoStructure S001) in BW. Can I just load data from the "business content datasources" (SBIW) and select S001, or do I have to load data from LO?
What is the difference between the two (SBIW, LBWE)?
Thank you
Koala
Hi Koala,
After executing the SBIW tcode you get a screen from which other tcodes are accessible, including LBWE. Namely, in the SBIW screen,
'Data Transfer to the SAP Business Information Warehouse / Settings for Application-Specific DataSources (PI) / Logistics / Managing Extract Structures / Logistics Extraction Structures Customizing Cockpit' is the LBWE tcode!
As I have already answered you:
anyone experienced data load for S001?
You need to choose between LIS extraction and LO extraction, not between SBIW and LBWE.
BTW, managing LIS setup is situated in SBIW also:
...Logistics/Managing transfer information structures/Application-Specific setup of statistical data.
Best regards,
Eugene -
How to retrieve new records that are posted at the time of loading data
hi experts
I have a doubt.
If we are performing a load operation and the client posts some new records related to the current load at the time of loading, will those records be transferred to the target with the current load, or do we have to load them with the delta load? What problems can occur in this situation?
Also, we have an option in RSA3 -- "Blocked Orders"; is it going to be helpful in this situation?
I also found an answer saying that at the time of loading we need to lock the base tables so that new data is blocked; is that the solution for the above scenario?
Thanks in advance.
Hi Lokesh,
Not clear if you are referring to posting of records during an initialization activity or normal delta, full loads. In case of an initialisation for a LO Cockpit datasource you cannot allow any postings to be done in the source (ECC) system. In case of normal delta, full loads, the changes are stored in tables/extract structures/delta queues and are not affected by the changes in the source system during that time. The changes done during that time are captured in the next delta run.
Hope this helps!
You may want to refer to blogs from Roberto on the extraction methods and their operations.
Regards,
Kunal Gandhi -
Getting short dump at the time of loading data from R/3 to ODS
Hi BW Grurus,
I am trying to load data from R/3 to an ODS, but after running for a few minutes it ends in a short dump with the following runtime error. Please give me a solution for how I can load the data without getting a short dump. I have tried three times, and it fails the same way each time.
Runtime error: TSV_TNEW_PAGE_ALLOC_FAILED
Hi,
Check whether a start routine or individual routine is present in the update/transfer rules.
Reading a large amount of data (SELECT * FROM) from another ODS into an internal table may cause this type of error.
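If the routine really does have to read a large ODS table, one common workaround is to read it in blocks rather than with a single SELECT * into one big internal table. A minimal sketch, assuming a hypothetical ODS active table /BIC/AZMYODS00:

```abap
* Process a large table in blocks of 50,000 rows; holding the whole
* table in one internal table at once is what typically triggers
* TSV_TNEW_PAGE_ALLOC_FAILED. /BIC/AZMYODS00 is a placeholder name.
DATA: lt_block TYPE STANDARD TABLE OF /bic/azmyods00,
      ls_row   TYPE /bic/azmyods00.

SELECT * FROM /bic/azmyods00
  INTO TABLE lt_block
  PACKAGE SIZE 50000.

  " lt_block holds only the current block; it is replaced on each pass
  LOOP AT lt_block INTO ls_row.
    " ... process the row here ...
  ENDLOOP.

ENDSELECT.
```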
Regards,
Saran -
Regarding Short Dump While loading data from DB Connect
Dear All,
We are having an issue with a short dump while loading data from DB Connect to BW. We were able to load the data into BW Dev using the same DataSource without any problem, whereas in Production I am getting the following error:
Runtime Error: PERFORM_CONFLICT_TAB_TYPE
Except.: CX_SY_DYN_CALL_ILLEGAL_TYPE
What could be the reason for the error that I am getting?
Hi,
Refer Note 707986 - Writing in trans. InfoCubes: PERFORM_CONFLICT_TAB_TYPE
Summary
Symptom
When data is written to a transactional InfoCube, the termination PERFORM_CONFLICT_TAB_TYPE occurs. The short dump lists the following reasons for the termination:
("X") The row types of the two tables are incompatible.
("X") The table keys of the two tables do not correspond.
Other terms
transactional InfoCube, SEM, BPS, BPS0, APO
Reason and Prerequisites
The error is caused by an intensified type check in the ABAP runtime environment.
Solution
Workaround for BW 3.0B (SP16-19), BW 3.1 (SP10-13)
Apply the attached correction instructions.
BW 3.0B
Import Support Package 20 for 3.0B (BW3.0B Patch20 or SAPKW30B20) into your BW system. The Support Package is available once note 0647752 with the short text "SAPBWNews BW3.0B Support Package 20", which describes this Support Package in more detail, has been released for customers.
BW 3.10 Content
Import Support Package 14 for 3.10 (BW3.10 Patch14 or SAPKW31014) into your BW system. The Support Package is available once note 0601051 with the short text "SAPBWNews BW 3.1 Content Support Package 14" has been released for customers.
BW3.50
Import Support Package 03 for 3.5 (BW3.50 Patch03 or SAPKW35003) into your BW system. The Support Package is available once note 0693363 with the short text "SAPBWNews BW 3.5 Support Package 03", which describes this Support Package in more detail, has been released for customers.
The notes specified may already be available to provide advance information before the Support Package is released. In that case, however, the short text still contains the term "Preliminary version".
Header Data
Release Status: Released for Customer
Released on: 18.02.2004 08:11:39
Priority: Correction with medium priority
Category: Program error
Primary Component: BW-BEX-OT-DBIF Interface to Database
Secondary Components: FIN-SEM-BPS Business Planning and Simulation
Releases
Software Component   Release   From Release   To Release (and subsequent)
SAP_BW               30        30B            30B
SAP_BW               310       310            310
SAP_BW               35        350            350
Support Packages
Support Package       Release   Package Name
SAP_BW_VIRTUAL_COMP   30B       SAPK-30B20INVCBWTECH
Related Notes
693363 - SAPBWNews BW SP03 NW'04 Stack 03 RIN
647752 - SAPBWNews BW 3.0B Support Package 20
601051 - SAPBWNews BW 3.1 Content Support Package 14
Correction Instructions
Correction Instruction   Valid from   Valid to   Software Component   Ref. Correction   Last Modification
301776                   30B          350        SAP_BW               J19K013852        18.02.2004 08:03:33
Attributes
Attribute             Value
weitere Komponenten   0000031199
Thanks
(Activate ODS/Cube and Transfer rules again..) -
Invalid Data Status while loading data in planning book
Hi,
Recently I got the error message "Invalid Data Status - Error reading data - planning book cannot be processed further". I was able to resolve the issue by running program /SAPAPO/TS_LCM_CONS_CHECK with the options Repair and Check liveCache Anchor. Now here is my question: I want to know why the liveCache anchor error occurred. What steps can I take in the future to avoid this error?
Regards,
Kartik
Kartik,
I couldn't say. In all the years I have been using SCM, I have never been able to prevent all such errors from occurring; I have only been able to solve certain problems under certain specific circumstances. Usually, the 'solvable' problems I have seen are the result of poorly written enhancements or custom programs.
One might speculate that the reason SAP provides these repair programs (yes, there are MANY inconsistency-repair programs) is that they believe such errors are inevitable.
Maybe you would be better served by just running such programs periodically in batch, as recommended by SAP.
http://service.sap.com/~sapidb/011000358700000955412003E
Best Regards,
DB49 -
How to filter the records in the process of loading data into ODS
Hi All,
I am doing some data extraction from an R/3 system, but I need to ignore (not load) the records whose date field (purchase order date) is blank.
How can I filter out the records with a blank date field?
Someone suggested writing a select statement (looping over the data packages) in the start routine, but I am not that experienced with ABAP routines. Can anyone provide a sample format for the routine, or suggest some other way to filter out this data?
Ram Kumar,
I used this in the start routine but it gives me an error saying ...
No component exists with the name "TRAN_STRUCTURE-BEDAT"
<b>delete datapak where TRAN_STRUCTURE-BEDAT = '' OR
TRAN_STRUCTURE-BADAT = '' OR
TRAN_STRUCTURE-ERDAT = '' OR
TRAN_STRUCTURE-FRGDT = '' OR
TRAN_STRUCTURE-UDATE = ''.</b>
I have taken out the TRAN_STRUCTURE part and just typed
delete datapak where BEDAT = '' OR
BADAT = '' OR
ERDAT = '' OR
FRGDT = '' OR
UDATE = ''.
The above statement has no syntax errors, but it doesn't filter out the records whose date fields are blank.
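For what it's worth, a likely explanation (my assumption, not from the thread): the first variant failed because inside DELETE ... WHERE you address the components directly, without the TRAN_STRUCTURE- prefix; and the second variant matches nothing because these fields are of type DATS, where an "empty" date is stored as '00000000' rather than as an empty string, so comparing against '' never matches. Testing with IS INITIAL should work:

```abap
* Delete rows whose date fields are blank. For DATS fields the initial
* value is '00000000', which IS INITIAL matches; a literal '' does not.
DELETE datapak WHERE bedat IS INITIAL
                  OR badat IS INITIAL
                  OR erdat IS INITIAL
                  OR frgdt IS INITIAL
                  OR udate IS INITIAL.
```

(Equivalently, compare each field against '00000000'.)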
Please help me out here.
This is kind of urgent.
Thanks, -
Regarding short dump while loading data
Hi All,
I am loading data using 0FI_GL_4. This is a periodic load which was working fine until March. A lot of postings happened in April, and the period was open until the 20th. Now, when I try to load period 12 / April, I get the following error in the short dump: "ABAP/4 processor: DBIF_RSQL_SQL_ERROR".
This is the error analysis:
An exception occurred. This exception is dealt with in more detail below. The exception, which is assigned to the class 'CX_SY_OPEN_SQL_DB', was neither caught nor passed along using a RAISING clause, in the procedure "BWFIR_READ_BSEG_CPUDT_DATA" "(FUNCTION)".
I have tried to load the data 5 times now, and every time I get the same error. I checked the DB space with the Basis people and it is fine.
Why am I suddenly getting this error?
Hi,
It looks like an internal table overflow in your source system.
Try reducing the data package size of your load, or ask your Basis guys to have a look at the short dump and perhaps increase this memory.
Try RSA3 in the source system with your data package size values from BW and see if it works. If not, reduce the number of records...
On which plugin release is your system running?
hope this helps,
Oliviier. -
Regarding the Time dependent master data error
Hi Masters,
I've loaded time-dependent master data from a flat file, only two fields. When I checked the corresponding characteristic in Maintain Master Data, it contains by default the date-from 01.01.2007 and the date-to 31.12.9999 as well. Could you please help me rectify this error.
Thanks in advance
Raja.S
Hi Antonino La Vela,
I have 2 project managers, with different durations for different projects.
The following is my data in the Excel sheet:
PM Name       To Date      From Date    Costcenter
Ragunath      01.09.2007   01.06.2006   Project name1
Ramana mani   01.02.2008   02.09.2007   Project name2
While loading the above data, I get the following data in Maintain Master Data:
PM Name       To Date      From Date    Costcenter
              31.12.9999   01.01.1000
Ragunath      31.05.2007   01.01.1000
Ragunath      01.09.2007   01.06.2007   Project Name1
Ragunath      31.12.9999   02.09.2007
Ramana mani   01.09.2007   01.01.1000
Ramana mani   01.02.2008   02.09.2007   Project Name2
Ramana mani   31.12.9999   02.02.2008
Could you please help me understand why these unnecessary records are loaded by default?
Thanks in Advance
Raja.S -
Need help regarding the maximum limit for data type
Hi,
This is Sravan. For my application I am inserting bulk data in XML format into a column of a database table.
The sample inserted XML data will be in a format like:
'<ACC count = "10">
<Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
<Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
<Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
<Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
<Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
<Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
<Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
<Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
<Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
<Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
</ACC>'
The data in the XML can have more than 1000 accounts.
Now, I need to take a parameter value from an XML node and write it to a file. For this I have written a procedure that loops over the nodes of the XML data and builds a dynamic query like:
if nvl(v_int, 0) > 0 then
  v_sql := '';
  for v_count in 1..v_int loop
    if v_sql is null then
      v_sql := 'select extractvalue(empdetails, ''/ACC/Acc' || v_count || '/@ac_no'')'
            || ' || extractvalue(empdetails, ''/ACC/Acc' || v_count || '/@cn_nr'')'
            || ' || extractvalue(empdetails, ''/ACC/Acc' || v_count || '/@eflag'')'
            || ' string from sample1';
    else
      v_sql := v_sql
            || ' union all select extractvalue(empdetails, ''/ACC/Acc' || v_count || '/@ac_no'')'
            || ' || extractvalue(empdetails, ''/ACC/Acc' || v_count || '/@cn_nr'')'
            || ' || extractvalue(empdetails, ''/ACC/Acc' || v_count || '/@eflag'')'
            || ' string from sample1';
    end if;
  end loop;
end if;
I get the variable v_int from the count attribute of <ACC count = "10">; here 10 is the value of v_int, but in real time it can be more than 1000.
When building the dynamic query I use the variable v_sql, which is of data type LONG. This variable stores the dynamic query built while executing the procedure, but it throws the exception "numeric or value error: character string buffer too small", even though the LONG data type should store up to 2 GB.
The variable is not able to hold that much dynamically built data.
Will you please help me resolve this issue, or suggest another method to pick the data from the XML column?
thanks,
Sravan.
user11176969 wrote:
I changed the code, and now it's working fine.
Directly assigning the dynamic query to a CLOB variable raised an error.
For building the dynamic query I used another variable, and then assigned that variable's value to the actual query variable.
Nice! -
Regarding the downloading amount of data through export to PDF in WAD
Hi, in our WAD we are using the export-to-PDF option to download the data. Since we have more than 300 pages in our report, the data is not getting downloaded. If we restrict it to some state (say, some 30 pages), then it works fine.
Is there any setting to overcome this problem?
I need to download all the data, no matter how many pages it is.
Does anyone have ideas regarding this?
Hi Dolly,
Export to PDF is assigned to a web item such as Analysis or Chart.
Go to the Analysis properties; there you have to adjust the number of rows and columns per page.
Hope this helps you.
Best Regards,
VVenkat..