Check data load or not in subtype of table type
Hi friends,
I am using a global collection variable vt_product_tab.
I created a record type tr_product_tab, then:
type tt_product_tab is table of tr_product_tab index by binary_integer;
vt_product_tab tt_product_tab;
When the package is loaded the first time, data is loaded into vt_product_tab. Now I would like to write a procedure to check whether data has been loaded into vt_product_tab.
Venkat.
Hi,
Have a managed bean that has a JSF binding reference (table property) for the table. Then on the table you can call getSelectedRowKeys, which returns a row key set.
Frank
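Back to the original PL/SQL question: checking whether the collection already holds data only needs the collection's COUNT method. A minimal sketch, assuming the type declarations above live in the package and that a helper function is acceptable (the function name is made up for illustration):

```sql
-- Inside the package body: returns TRUE when vt_product_tab
-- already holds rows, so the initial load can be skipped.
FUNCTION data_loaded RETURN BOOLEAN IS
BEGIN
  RETURN vt_product_tab.COUNT > 0;
END data_loaded;
```

A caller would then do `IF NOT data_loaded THEN load_products; END IF;` in the initialization code.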
Similar Messages
-
Data load is not picking up all records
Hello, I wonder if anyone has ever encountered the following problem. I've been trying to load a flat file to BPC, and for a year now we've had no issues. This time, however, it seems that not all records were loaded, even though the status indicated the correct number of rows were successfully loaded. Any ideas what may be causing this? I've checked the conversion files and the account dimension, and nothing has changed. Any help with this is greatly appreciated. Ideas on how/where to check for potential trouble spots are all welcome.
Thanks in advance.
David

Hello,
If you are using the standard SSIS package import, the rejected records should be reported in the log file at the end. However, in order to see whether there are rejected records, you can try to validate your transformation file using the same import file.
You can also verify the temporary files generated in D:\BPC\Data\Webfolders\APSHELL_Sorin\Finance\PrivatePublications. In this way you can find the actual values imported into the cube.
How do you check whether the data is loaded or not?
Best regards,
Mihaela -
Master Data load does not extract Hierarchy nodes in BPC Dimension ACCOUNT
Hi Experts,
I am performing master data load through standard DM package with Filter selection as:
1. Chart of Accounts
2. Hierarchy selection has 4 hierarchy names
3. Selected Import Text nodes
4. Selected Set Filters by Attribute OR Hierarchies
I have run this DM package for a set of data and selections a week ago and it worked fine.
However, when I run it now it gives issues:
It extracts any new GL maintained in the BI system, but it does not extract any hierarchy nodes at all! (I have tested this by deleting the hierarchy nodes and re-running the master data load.)
I am running the DM package in Update mode and have the selection as External.
Any suggestions for checks? Has anyone encountered this issue before?
Regards,
Shweta Salpe

Hi guys,
Thanks.
I found that the issue was with the transformation file where I was maintaining RATETYPE.
When I removed the mapping of RATETYPE, it works fine (it pulls the hierarchy nodes).
However, now I do not have RATETYPE populated in the system.
my rate type mapping is:
RATETYPE=*IF(ID(1:1)=*STR(C) then *STR(TOSKIP);ID(1:1)=*STR(H) then *STR(TOSKIP);ID)
and in conversion file i have TOSKIP *skip
I have to skip the rate types for the hierarchy nodes, and my hierarchy nodes start with C and H.
So now that I have removed the mapping for RATETYPE, can anyone suggest a correct way to achieve this? (Note: the above mapping formula was skipping all of the hierarchy nodes starting with C and H.)
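One possible direction, hedged since BPC transformation and conversion file syntax varies by version: rather than mapping hierarchy-node rows to a value that the conversion file *skip*s (which drops the entire record, node included), map them to a placeholder that the conversion file converts to a blank or default rate type, so the node record survives with no rate type. A sketch reusing the formula above (NORATE is a made-up placeholder name):

```
RATETYPE=*IF(ID(1:1)=*STR(C) then *STR(NORATE);ID(1:1)=*STR(H) then *STR(NORATE);ID)
```

Then in the conversion file, map NORATE to an empty value (or a default rate member) instead of *skip. Whether a blank internal value is accepted depends on your dimension setup, so test on a small file first.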
Regards,
Shweta Salpe -
Master data loads Request not updated to any data target using delta
When checking the PSA the request updated ICON has red triangle and the mouse over says
"Request not updated to any data target using delta".
I am doing full loads for text data and both full and delta loads for attributes. This is a BI 7.0 system but the loads are still 3.x style; DTPs have not been implemented yet. The system was just upgraded in July. I am unable to schedule deletes from the PSA for successful loads. However, I think the data is updating to the InfoObjects. My InfoPackage is set to load to the PSA and then into the InfoObject, package by package.
How do I schedule the deletes from the PSA, and why does the Request updated column show red while the InfoPackage monitor shows green?
Edited by: Joe Mallorey on Jan 27, 2009 5:46 PM

Hi Shikha,
The load has not failed but I am unable to delete the load from the PSA.
The load has not failed, but I am unable to delete the load from the PSA. If you do a Manage on the DataSource, or go to the PSA from RSA1, the first column has the green gear icon instead of a green check mark; I have a red triangle whose mouse-over says "Request not updated to any data target using delta". The data has loaded to the InfoObject. I am trying to schedule deletes from the PSA using the option "delete only successfully booked/updated requests". So how do I get the Request updated column to show a green check mark so my deletes will process? This is for master data only; my transaction loads run fine and delete properly according to my settings.
Thanks for the reply.
Regards,
JoeM -
Master Data Loaded but not able to see.
I loaded master data successfully. I am able to see it in the P and M tables, but not able to see it when going through Manage -> Contents at the InfoProvider level.
Many requests are available showing the number of records transferred and updated.
What could be the possible reason?

Hi Sarabjit,
Welcome to SDN!!
You need to load the transaction data into the Info-provider to see the contents and not just the master data.
In case you have done the transaction load, then you need to do "apply hier/attr change" for the master data loaded. Go to RSA1 -> Tools (menu) -> Apply hier/attr change -> select your InfoObject and execute.
Bye
Dinesh -
Check data load performance for DSO
Hi,
Can anyone please provide details on how to check the data load performance for a particular DSO?
For example, how much time it took to load a particular number of records (e.g. 200,000) into the DSO from the R/3 system. The DSO data flow is in the BW 3.x version.
Thanks,
Manjunatha.

Hi Manju,
You can use BW Statistics and its standard content.
Regards,
Rambabu -
Data Load Wizard not Inserting/Updating all rows
Hello,
I am able to run through the whole Data Load Wizard without any problems. It reports that it successfully inserted/updated all the rows, but when I look in the table, I find a few rows that were not updated correctly. Of the entries I've identified that don't get inserted/updated properly, I've noticed they are the same rows that I was having issues with earlier. The issue was a number format error, which I solved by providing an explicit number format.
Is it possible that the false inserts/updates are still tied to the number format, or are there other reasons why the data load fails on only some rows?
Thanks,
Brian
Edited by: 881159 on Mar 14, 2012 5:05 PM

Hi Brian,
I am not aware of a situation where you get false results. However, there were some issues with number/date formats that sometimes were not properly parsed, and this has been fixed in the 4.1.1 patch. Should your case be different from the one described in bug 13656397, I will be happy to get more details so that I can take a look at what is going on.
Regards,
Patrick -
BOM item data - Document assignment not getting saved in table STPO
Dear All,
As per our customer's requirement, they want to display the assigned drawing documents of BOM items in CS11. But it is not reflected in CS11 even though we have assigned documents on the BOM item detail overview, Document assignment tab page. Moreover, this is not getting saved in table STPO.
Waiting for solutions
Thanks & Regards
Dhananjay Kulkarni

Please understand my customer's requirement.
Presently they assign the drawing document of each material in the material master. If they want to see the drawing of each BOM component through BOM maintenance, they double-click on the component in the BOM, go to the material master, then additional data, then document data, and then the drawing. This way they need to open a minimum of 5 windows, which they want to reduce to 2.
The BOM has more than 300 components, so it is not possible to define documents as components with item category D for so many components. One idea I gave them was to club all drawings into one document and define that document as a component with item category D, but that way they can't identify the individual drawing by part name.
Another idea I gave them was to assign the drawing document of the individual component on the item overview Document assignment tab page, from where they can open the drawing, but that still needs a minimum of 3 windows. So they are ready to assign the drawing document on the item overview Document assignment tab page, but they want it reflected in the CS11 document column, from where they can easily open the drawing document.
One more thing I observed: a document assigned on the item overview Document assignment tab page is not getting saved in table STPO. I can't see this field's content in STPO, which may be the reason it is not reflected in CS11.
Is anything missing from the Document Management System?
Please think it over and reply, friends; it's urgent.
Thanks & Warm Regards
Dhananjay Kulkarni -
Display the dates which do not exist in my table
Dear Sir
I want to get the dates which do not exist in the C_DATE field. The user enters from_date and to_date, and I want to find the dates in that range which do not exist in the CHEMICAL_CONSUMPTION table.
My table structure is as follows:
SQL> DESC CHEMICAL_CONSUMPTION
Name Null? Type
C_DATE DATE
C_JOB_NO NUMBER(8)
C_BLOCK_TYPE VARCHAR2(10)
C_COLOR_TYPE VARCHAR2(10)
C_RUN_TIME NUMBER(6,2)
C_OPER_CODE VARCHAR2(10)
REMARKS VARCHAR2(450)
C_YEAR NUMBER(4)
C_DAY_REMARKS VARCHAR2(2000)
ORDER_CODE NUMBER(12)
RRQ_DOC_CODE NUMBER(12)
Can I compare this table with the dual table to display the dates which are not in CHEMICAL_CONSUMPTION?
Waiting for your valuable answer with example
Best regards
Jamil Alshaibani

Something like:
SELECT *
FROM CHEMICAL_CONSUMPTION
WHERE C_DATE BETWEEN :BLOCK.FROMDATE AND :BLOCK.TODATE
OR C_DATE IS NULL;

Just a guess. -
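Note that the query above returns rows that do exist in the range, not the missing dates the question asks for. One hedged sketch of the actual answer in Oracle SQL, assuming :from_date and :to_date bind variables: generate every date in the range with a row generator on dual, then remove the dates already present in the table.

```sql
-- Generate each date from :from_date to :to_date inclusive,
-- then keep only those not already present in CHEMICAL_CONSUMPTION.
SELECT d.missing_date
FROM  (SELECT :from_date + LEVEL - 1 AS missing_date
       FROM   dual
       CONNECT BY LEVEL <= :to_date - :from_date + 1) d
WHERE  d.missing_date NOT IN (SELECT TRUNC(c_date)
                              FROM   chemical_consumption
                              WHERE  c_date IS NOT NULL);
```

The TRUNC and the IS NOT NULL guard are there because C_DATE is nullable and may carry a time component; adjust if your data is already clean.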
Data load warning, not yet completed
Hello, I am trying to load master data from a view using a generic DataSource.
I have a characteristic InfoObject to which I assigned the DataSource, and I created an InfoPackage.
When I schedule the load, it gives a "request processed" message; when I go to the monitor it gives the following message:
Request still running
Diagnosis
No errors could be found. The current process has probably not finished yet.
System response
The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
and/or
the maximum wait time for this request has not yet run out
and/or
the batch job in the source system has not yet ended.
Current status
No Idocs arrived from the source system.
My data target is a characteristic InfoObject. Can anyone tell me what I should do?
Thanks,
dk

Hi,
Check R/3 SM58 and BD87 for pending tRFCs and IDOCS and execute them if required.
Transact RFC error
tRFC Error - status running Yellow for long time (Transact RFC will be enabled in Status tab in RSMO).
Step 1: Go to Details -> Status, get the IDoc number, and go to BD87 in R/3. Place the cursor on the red IDoc entries in the tRFC queue under outbound processing and click Display IDoc on the menu bar.
Step 2: In the next screen click on Display tRFC calls (this takes you to the particular tRFC call in SM58), place the cursor on the particular Transaction ID, go to Edit in the menu bar, and press 'Execute LUW'.
Rather than going to SM58 and executing the LUW directly, it is safer to go through BD87 with the IDoc number, as it will take you to the particular tRFC request for that IDoc.
OR
Go into the job overview of the load; there you should be able to find the data package TID.
(In the RSMO screen -> Environment there is an option for job overview.)
This data package TID is the Transaction ID in SM58.
OR
In SM58, enter * or the background user name (e.g. ALEREMOTE) and execute. It will show you all the pending tRFCs with their Transaction IDs.
In the Status Text column you can see two statuses:
Transaction Recorded and Transaction Executing.
Don't disturb it if the status is the second one (Transaction Executing). If the status is the first one (Transaction Recorded), manually execute the LUW ("Execute LUWs").
OR
Go directly to SM58, enter * or the background user name (e.g. ALEREMOTE), and execute. It will show the tRFCs to be executed for that user. Find the particular tRFC (SM37 -> request name -> TID from the data packet with sysfail), select the TransID (SM58) -> Edit -> Execute LUW.
Process IDOCS Manually
IDOCS Process Manually
Non-updated Idocs found in Source System
http://help.sap.com/saphelp_nw04s/helpdata/en/0b/2a6620507d11d18ee90000e8366fc2/frameset.htm
http://help.sap.com/saphelp_nw04/helpdata/en/dc/6b815e43d711d1893e0000e8323c4f/content.htm
Re: mannualy process idoc
Non-updated Idocs found in Source System
regarding idocs
Data Load from XML file to Oracle Table
Hi,
I am trying to load data from an XML file into an Oracle table using the DBMS_XMLStore utility. I have performed the prerequisites: created the directory from the APPS user, granted read/write on the directory, placed the data file in a folder on the apps tier, and created a procedure 'insertXML' to load the data based on Metalink note 396573.1 (How to Insert XML by Passing a File Instead of Using Embedded XML). I am running the procedure through the anonymous block below to insert the data into the table.
Anonymous block
begin
  insertXML('XMLDIR', 'results.xml', 'employee_results');
end;
/
I am getting below error after running the anonymous block.
Error: ORA-22288: file or LOB operation FILEOPEN failed
Cause : The operation attempted on the file or LOB failed.
Action: See the next error message in the error stack for more detailed
information. Also, verify that the file or LOB exists and that
the necessary privileges are set for the specified operation. If
the error still persists, report the error to the DBA.
I searched for this error on Metalink and found Doc ID 1556652.1. I ran the script provided in the document. PFA the script.
Also, attaching a document that list down the steps that I have followed.
Please check and let me know if I am missing something in the process. Please help to get this resolved.
Regards,
Sankalp

Thanks Bashar for your prompt response.
I ran the INSERT statement but encountered an error; below are the error details.
Error report -
SQL Error: ORA-22288: file or LOB operation FILEOPEN failed
No such file or directory
ORA-06512: at "SYS.XMLTYPE", line 296
ORA-06512: at line 1
22288. 00000 - "file or LOB operation %s failed\n%s"
*Cause: The operation attempted on the file or LOB failed.
*Action: See the next error message in the error stack for more detailed
information. Also, verify that the file or LOB exists and that
the necessary privileges are set for the specified operation. If
the error still persists, report the error to the DBA.
INSERT statement I ran
INSERT INTO employee_results (USERNAME, FIRSTNAME, LASTNAME, STATUS)
SELECT *
FROM XMLTABLE('/Results/Users/User'
       PASSING XMLTYPE(BFILENAME('XMLDIR', 'results.xml'),
                       NLS_CHARSET_ID('CHAR_CS'))
       COLUMNS USERNAME  NUMBER(4)    PATH 'USERNAME',
               FIRSTNAME VARCHAR2(10) PATH 'FIRSTNAME',
               LASTNAME  NUMBER(7,2)  PATH 'LASTNAME',
               STATUS    VARCHAR2(14) PATH 'STATUS');
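For reference, ORA-22288 with "No such file or directory" usually means the file is not visible to the database server: BFILENAME resolves the directory object's path on the server, not on the client. A hedged sketch to double-check the directory object (the path shown is an example, not the real one; run as a suitably privileged user):

```sql
-- Re-create the directory object pointing at a path that exists
-- on the database server, then confirm the grant.
CREATE OR REPLACE DIRECTORY XMLDIR AS '/u01/app/xml';
GRANT READ, WRITE ON DIRECTORY XMLDIR TO apps;

-- Verify what the database actually thinks XMLDIR points to:
SELECT directory_name, directory_path
FROM   all_directories
WHERE  directory_name = 'XMLDIR';
```

If the path is right, also confirm that the OS user running the database can read results.xml in that folder.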
Regards,
Sankalp -
Data Load from Form's Column to Table's Row!
Hi All,
I face a problem & want to share with you people...
I have created a table named "Employee Pay" having columns EmpId, PayCode, PayAmount.
I have designed a form for the table like below:
EmpId Basic HRA
1 100 50
2 150 80
When I want to add or update a record in the above form, the data should be filled into the table like below:
EmpId PayCode PayAmount
1 Basic 100
1 HRA 50
2 Basic 150
2 HRA 80
How to do it?
Thanks & Regards,
Nabanita
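One way to turn the form's columns into rows is an Oracle multi-table INSERT. A sketch, assuming the form's data is available through a staging table (here hypothetically named employee_form with columns empid, basic, hra):

```sql
-- Each source row (EmpId, Basic, HRA) produces two target rows,
-- one per pay code.
INSERT ALL
  INTO employee_pay (empid, paycode, payamount) VALUES (empid, 'Basic', basic)
  INTO employee_pay (empid, paycode, payamount) VALUES (empid, 'HRA',   hra)
SELECT empid, basic, hra
FROM   employee_form;
```

In Oracle Forms specifically, the same effect can be achieved in a WHEN-BUTTON-PRESSED or ON-INSERT trigger that issues one INSERT per pay-code column of the current record.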
Edited by: ghosh.nabanita on Oct 19, 2011 10:29 AM

Hello,
It seems that your cube doesn't accept lower-case letters. You can change this setting in RSKC to accept all characters. The other way is to write a small piece of code in the update rules or the transformation which converts all incoming data to upper case and then pushes it to the cube.
Do assign points if it helps.
Thanks
Rishi -
Data Package 000001 : sent, not arrived. No data load in BW 7
Hi gurus, I am working with BW 7 and I already had the complete data flow from SRM to BW: some Business Content extractors and some other generic extractors. We have created transformations, InfoPackages and DTPs; in fact, we have loaded master data and transactional data. But since last Monday the data load is not working any more; now every time we manually run the InfoPackage the following warning message appears:
Data Package 000001 : sent, not arrived
and the data load just waits for an answer from SRM (which is the source system), or SRM waits for a request from BW.
We have reactivated the extractors in SBIW, replicated and reactivated them in BW (we also reactivated the transformations), and the data load is still not working. Then we also regenerated the DataSource, but it does not work.
Do you have any idea of what is happening or what the problem could be?
Thanks

Check transaction SMQA in the source system. The communication might not be working. If there are any entries in red, then select Execute LUWs from the menu to manually process them.
Read the error and try to resolve it first, though. -
Hello Guru's,
I am working on BI 3.5. I want to check the data load status in an InfoCube and ODS: where should I check data load failures and successes in detail?
Data is uploaded by data requests, and I want to check which request has failed.
Please also help me with how to create master data sources in R/3.
Regards
Shivaraj
Edited by: Shvai Patil on Feb 1, 2008 4:45 AM

Hi,
You can monitor all the requests in RSMO and also check in SM37.
For creating custom master data sources in R/3 you can use transaction RSO2.
Hope it helps you.
Let us know if still have any issues.
Reg
Pra -
Data loader : Import -- creating duplicate records ?
Hi all,
has anyone else encountered the behaviour with Oracle Data Loader where duplicate records are created (even when I set the option duplicatecheckoption=externalid)? When I check the "import request queue - view", the request parameters of the job look fine:
Duplicate Checking Method == External Unique ID
Action Taken if Duplicate Found == Overwrite Existing Records
but Data Loader has created new records where the "External Unique ID" already exists.
Very strangely, when I create the import manually (using the Import Wizard), exactly the same import works correctly! There the duplicate checking method works and the record is updated.
I know Data Loader has 2 methods, one for update and one for import; however, I would not expect the import to create duplicates if the record already exists, rather than doing nothing!
Is anyone else experiencing the same? I hope this is not expected behaviour! By the way, the "Update" method works fine.
thanks in advance, Juergen
Edited by: 791265 on 27.08.2010 07:25
Edited by: 791265 on 27.08.2010 07:26

Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load first, before a full load, which is a best practice for data import that we recommend in our documentation and courses.
Sorry also to inform you that this is expected behavior: Data Loader does not check for duplicates when inserting (aka importing). It only checks for duplicates when updating (aka overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and in the Data Import Options Overview document.
You should review all documentation on Oracle Data Loader On Demand before using it.
These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
Pete
Maybe you are looking for
-
I have registered for the up-to-date program. However, the content code I received was reported as not a recognised valid code during redemption. What can I do?
-
InfoSource 8ODS_SLS is not defined in the source system
Hi guys, when I tried to load data from the ODS to the InfoCube, I got the following error: "InfoSource 8ODS_SLS is not defined in the source system". The source system is not R/3; it is just a flat file, and the data was uploaded into the ODS. When trans
-
When is the next firmware released after v21.0.016...
When is the next FW upgrade after V21.0.016, which has known bugs reported? I hope Nokia rectifies them in the next FW soon... My N95-1 is getting slower and slower in response... when it boots up it takes 15 minutes, and some phone pre-installed functions like No
-
My DVD player is not working with any type of DVD. It comes up with the error: Valid Device Not Found For Playback (-10017). VLC is also having issues even after updates and will play audio but not picture.
-
Import and new operations using cfimport
This question was posted in response to the following article: http://help.adobe.com/en_US/ColdFusion/9.0/Developing/WS61C07B60-3D65-4d71-8F2A-8411D8010E60.html