Error while extracting data into 0GL_ACCOUNT
Hi all,
The delta initialization for extracting data into 0GL_ACCOUNT using DataSource 0GL_ACCOUNT_ATTR completed successfully.
However, when running a delta update, the following error is displayed: 'ALE change pointers are not set up correctly', and it asks me to activate change pointers in BD61 (source system).
I am not sure what exactly I am supposed to do in BD61. Please help.
Regards,
Srikar
Hi Sushant,
Thanks for the reply.
I have checked DataSource 0GL_ACCOUNT_ATTR in RSA2.
All the entries are present; however, the status of SAKAN is not successful (a criss-cross icon is displayed).
What am I supposed to do?
Regards,
Srikar
Similar Messages
-
Error while extracting data from data source 0RT_PA_TRAN_CONTROL, in RSA7
Hi Gurus,
I'm getting the below error while extracting data from DataSource 0RT_PA_TRAN_CONTROL in RSA7. (This is actually an IS Retail DataSource used to push POSDM data into BI cubes.)
The error is:
Update mode "Full Upload" is not supported by the extraction API
Message no. R3011
Diagnosis
The application program for the extraction of the data was called using update mode "Full Upload". However, this is not supported by the InfoSource.
System Response
The data extraction is terminated.
Procedure
Check for relevant OSS Notes, or send a problem message of your own.
Your help in this regd. would be highly appreciated.
Thanks,
David.

Hi David,
I have no experience with IS Retail DataSources, but as the message clearly says, this DataSource is not supposed to be run in full mode.
Try switching your DTPs/InfoPackages to delta mode.
Also, when checking the extraction in the source system with transaction RSA3 (extractor checker), switch the Update mode field to Delta.
BR
m./ -
Error While extracting data from FIAA and FIAP
Hi Gurus,
I am facing an error while extracting data from the R/3 source with the 0FIAA and 0FIAP DataSources. I get the same error repeatedly and no data comes to BW, although when I check in RSA3 I can extract records.
The error is as follows:
Request still running
Diagnosis
No errors could be found. The current process has probably not finished yet.
System response
The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
and/or
the maximum wait time for this request has not yet run out
and/or
the batch job in the source system has not yet ended.
Current status
No Idocs arrived from the source system.
Kindly help. Your points are assured.
Thanks and Regards
Prasad

Hello Prasad,
Have you already checked what happened in the source system?
You should verify whether a job is running in SM37, and whether there are runtime error dumps caused by the extraction in ST22.
That could give a clue about what happened in R/3.
Let us know,
Regards,
Mickael -
Time Stamp Error while extracting data from R/3
Hi,
We are getting a time stamp error while extracting data from R/3.
To solve this problem we replicated the DataSource and ran program RS_TRANSTRU_ACTIVATE_ALL, but we are still facing the same problem.
Please suggest a solution.
Thanks
Subba Rao

Hi,
A time stamp error arises when the time stamps of the DataSource in the source system and the target system are different.
To fix it, activate the DataSource again in the R/3 system using transaction RSA5 or RSA6, then in the BI system go to transaction RSDS and replicate the DataSource.
You can also find the time stamp details for a DataSource in tables ROOSGEN and ROOSOURCE, in the BI and R/3 systems respectively.
Here are some useful threads:
- R3 016 Time stamp error where is it in BI?
- time stamp error in bi7
- Timestamp error in BI7
Thanks,
Venu -
Error while extracting data from Generic datasource
Hello Gurus,
I have encountered an 'Errors in source system' error with red status while extracting data from a generic DataSource into a DSO.
I have checked the following:
1. The job in the source system completed successfully.
2. The generic DataSource is active in the source system and delta-enabled on calendar day.
3. No tRFC errors and no stuck IDocs.
4. No short dumps in the source system.
5. The delta queue (RSA7) shows 0 records with green status.
6. The DataSource was replicated on the BW side and the transfer rules were activated with RS_TRANSTRU_ACTIVATE_ALL.
The problem is still not solved. Could you please suggest what might be causing this error?
Thanks,
Sonu

Hello,
I am using a generic delta on calendar day:
Safety upper limit: 1
Safety lower limit: 0
I have created entries in transaction VA01, but the delta queue is not getting updated; it should show 1.
Does generic extraction require a job to transfer data from the database table to the delta queue?
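For what it's worth, given the settings quoted above, one likely explanation (this is a sketch of the usual safety-interval arithmetic, not SAP's exact algorithm): with a calendar-day delta and a safety upper limit of 1, the still-open current day is excluded from the selection, so documents created today in VA01 would only be picked up by the next day's delta run. Also, for generic deltas the RSA7 queue is typically only filled when the delta InfoPackage actually runs, not continuously. A hypothetical Python sketch of the window:

```python
from datetime import date, timedelta

def calday_delta_window(last_pointer, today, safety_upper_days=1, safety_lower_days=0):
    """Selection window a calendar-day generic delta would read on this run.

    The lower bound moves back by the safety lower limit (the overlap is
    re-read); the upper bound moves back by the safety upper limit, which
    keeps the still-open current day out of the extraction.
    """
    low = last_pointer - timedelta(days=safety_lower_days)
    high = today - timedelta(days=safety_upper_days)
    return low, high

# A document created "today" is outside the window until tomorrow's run:
low, high = calday_delta_window(last_pointer=date(2010, 4, 26),
                                today=date(2010, 4, 27))
# high falls on the previous day, so today's VA01 documents are not selected yet
```

So a delta queue showing 0 on the same day the documents were created is consistent with these settings rather than necessarily an error.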
We are not able to get the delta records on the BW side. Please suggest.
Thanks,
Sonu -
Runtime Error while extracting data by datasource based on function module
Hi all,
I am facing an issue while extracting data from a customized DataSource based on a new function module.
The DataSource extracts data successfully for only 15,000 records; after that a runtime error is raised, so I am not able to extract all the data from the R/3 system.
Please take a look at the details and tell me what I should do.
Details of Issue:
Runtime Error : GETWA_NOT_ASSIGNED
What happened?
Error in the ABAP Application Program
The current ABAP program "SAPLZ_99Z_BW_SD_PRICING" had to be terminated because it came across a statement that unfortunately cannot be executed.
Error analysis
You attempted to access an unassigned field symbol
(data segment 32790).
This error may occur if
- You address a typed field symbol before it has been set with
ASSIGN
- You address a field symbol that pointed to the line of an
internal table that was deleted
- You address a field symbol that was previously reset using
UNASSIGN or that pointed to a local field that no
longer exists
- You address a global function interface, although the
respective function module is not active - that is, is
not in the list of active calls. The list of active calls
can be taken from this short dump.
Edited by: anshu13 on Apr 27, 2010 10:28 AM

The code is displayed here; the error is in line no. 625.
Source Code Extract
Line SourceCde
595 <fs_fldval> = l_fieldval.
596 APPEND <fs_dyntable> TO <it_dyntable>.
597 CLEAR :l_fieldval, wa_fldcat.
598 CLEAR l_totlength.
599 CLEAR <fs_dyntable>.
600 ENDIF.
601 ENDIF.
602 MOVE-CORRESPONDING <dd03l_fields> TO wa_dd03l.
603 IF <dd03l_fields>-datatype EQ 'DATS'.
604 l_fieldval = 'X'.
605 ELSE.
606 SHIFT <dd03l_fields>-intlen LEFT DELETING LEADING '0'.
607 IF l_totlength IS INITIAL.
608 SHIFT l_totlength LEFT DELETING LEADING '0'.
609 l_totlength = '0'.
610 ENDIF.
611 l_currlength = <dd03l_fields>-intlen.
612 CONCATENATE l_totlength '(' l_currlength ') ' INTO l_fieldval .
613 l_totlength = l_totlength + l_currlength.
614 ENDIF.
615*** Consider both field name and domian name for checking in range list:
616 IF <dd03l_fields>-fieldname IN s_fieldlist.
617 wa_fldcat-fieldname = <dd03l_fields>-fieldname.
618 ELSEIF <dd03l_fields>-domname IN s_fieldlist.
619 wa_fldcat-fieldname = <dd03l_fields>-domname.
620 ELSE.
621 wa_fldcat-fieldname = <dd03l_fields>-fieldname.
622 ENDIF.
623 ASSIGN COMPONENT wa_fldcat-fieldname
624 OF STRUCTURE <fs_dyntable> TO <fs_fldval>.
>>>>> 625 <fs_fldval> = l_fieldval.
626 CLEAR l_fieldval.
627 CLEAR l_currlength.
628 ELSE.
629 IF <dd03l_fields>-datatype EQ 'DATS'.
630 l_fieldval = 'X'.
631 ELSE.
632 SHIFT <dd03l_fields>-intlen LEFT DELETING LEADING '0'.
633 IF l_totlength IS INITIAL.
634 SHIFT l_totlength LEFT DELETING LEADING '0'.
635 l_totlength = '0'.
636 ENDIF.
637 l_currlength = <dd03l_fields>-intlen.
638 CONCATENATE l_totlength '(' l_currlength ') ' INTO l_fieldval .
639 l_totlength = l_totlength + l_currlength.
640 ENDIF.
641*** Consider both field name and domian name for checking in range list:
642 IF <dd03l_fields>-fieldname IN s_fieldlist.
643 wa_fldcat-fieldname = <dd03l_fields>-fieldname.
644 ELSEIF <dd03l_fields>-domname IN s_fieldlist.
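Reading the dump together with the extract: the ASSIGN COMPONENT in lines 623-624 can fail when the fieldname taken from wa_fldcat (a domain name in the ELSEIF branch) is not actually a component of <fs_dyntable>; line 625 then dereferences an unassigned field symbol, giving GETWA_NOT_ASSIGNED. The usual ABAP fix is to check sy-subrc (or IS ASSIGNED) after the ASSIGN. The same failure pattern, sketched in Python with hypothetical names since this is only an analogy, not the actual extractor code:

```python
def assign_component(structure, fieldname):
    """Rough analog of ABAP ASSIGN COMPONENT ... OF STRUCTURE: return a
    setter for the component, or None when it does not exist (in ABAP the
    field symbol would stay unassigned and sy-subrc would be <> 0)."""
    if fieldname not in structure:
        return None
    def setter(value):
        structure[fieldname] = value
    return setter

row = {"MATNR": "", "WERKS": ""}            # hypothetical dynamic structure
set_field = assign_component(row, "KUNNR")  # KUNNR is not a component
if set_field is not None:                   # the guard missing around line 625
    set_field("X")
# calling set_field("X") without the guard would be the
# GETWA_NOT_ASSIGNED analog: dereferencing an unassigned result
```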
Edited by: anshu13 on Apr 27, 2010 11:33 AM -
Error while loading data into External table from the flat files
HI ,
We have a data load in our project that feeds Oracle external tables with data from flat files (.bcp files) on Unix.
While loading the data, we are encountering the following error:
Error occurred (Error Code: -29913, Error Message: ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: un) while loading data into table_ext
Please let us know what needs to be done in this case to solve this problem.
Thanks,
Kartheek

Kartheek,
I used Google (mine still works)... please check these links:
http://oraclequirks.blogspot.com/2008/07/ora-29400-data-cartridge-error.html
http://jonathanlewis.wordpress.com/2011/02/15/ora-29913/
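In my experience this ORA-29913 / ORA-29400 / KUP-04063 combination during ODCIEXTTABLEOPEN very often comes down to OS-level access: the database owner cannot read the .bcp file, or cannot write the log/bad files into the DIRECTORY object's path. A hedged pre-check of those two conditions (paths and filenames are placeholders, run as the database OS user):

```python
import os

def check_ext_table_dir(dir_path, data_file):
    """Collect the access problems that typically surface as ORA-29913/KUP-04063."""
    problems = []
    if not os.path.isdir(dir_path):
        problems.append("directory missing: " + dir_path)
    elif not os.access(dir_path, os.W_OK):
        # log/bad/discard files are written into the DIRECTORY path
        problems.append("directory not writable: " + dir_path)
    full = os.path.join(dir_path, data_file)
    if not os.path.isfile(full):
        problems.append("data file missing: " + full)
    elif not os.access(full, os.R_OK):
        problems.append("data file not readable: " + full)
    return problems

# e.g. check_ext_table_dir("/data/loads", "feed.bcp") -> [] when all is well
```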
HTH,
Thierry -
Error while Inserting data into flow table
Hi All,
I am very new to ODI and am facing a lot of problems with my first interface, so I have many questions here; please bear with me.
========================
I am developing a simple project to load data from an input source (csv) file into a staging table.
My plan is to achieve this in 3 interfaces:
1. Interface-1: Load the data from the input source (csv) file into a staging table (say stg_1).
2. Interface-2: Read the data from the staging table (stg_1), apply the business rules, and copy the processed records into another staging table (say stg_2).
3. Interface-3: Copy the data from the staging table (stg_2) into the target table (say Target) in the target database.
Question-1: Is this approach correct?
========================
I don't have any key columns in the staging table (stg_1). When I tried to execute Flow Control on it I got the error:
Flow Control not possible if no Key is declared in your Target Datastore
Following a response in this forum ("FLOW control requires a KEY in the target table"), I introduced a column called "Record_ID", made it the primary key of my staging table (stg_1), and the problem was resolved.
Question-2: Is a key column compulsory in the target table? I am coming from BO Data Integrator, where there is no such requirement, so I am a little confused.
========================
Next, I defined a project-level sequence and mapped the newly introduced key column Record_ID (primary key) to it. Then I got another error: "CKM not selected".
For this, I inserted the "Insert Check (CKM)" knowledge module into my project, which resolved the "CKM not selected" problem.
Question-3: When is this CKM knowledge module required?
========================
After this, the flow/interface is failing while loading data into the intermediate ODI-created flow table (I$):
1 - Loading - SS_0 - Drop work table
2 - Loading - SS_0 - Create work table
3 - Loading - SS_0 - Load data
5 - Integration - FTE Actual data to Staging table - Drop flow table
6 - Integration - FTE Actual data to Staging table - Create flow table I$
7 - Integration - FTE Actual data to Staging table - Delete target table
8 - Integration - FTE Actual data to Staging table - Insert flow into I$ table
The error is at Step 8 above. When I opened the "Execution" tab for this step I found the message: "Missing parameter Project_1.FTE_Actual_Data_seq_NEXTVAL RECORD_ID".
Question-4: What/why is this error? Did I make a mistake while creating the sequence?

Everyone is new and starts somewhere, and the community is here to help you.
1.) What is the idea behind moving data from stg_1 and then to stg_2? Do you really need the intermediate steps for any purpose other than moving data from the source file to the target DB?
Otherwise, it is simpler to move the data directly from SourceFile -> Target Table.
2.) Does your target table have a key?
3.) A CKM (Check KM) is required when you want to do constraint validation (checking) on your data. You can define constraints (business rules) on the target table, and Flow Control will check the data flowing from the source file to the target table using the CKM. All records that do not satisfy a constraint are added to E$ (the error table) and are not added to the target table.
4.) Try to avoid ODI sequences; they are slow and aren't scalable. Use a database sequence wherever possible, and reference the DB sequence in the target mapping as
<%=odiRef.getObjectName( "L" , "MY_DB_Sequence_Row" , "D" )%>.nextval
where MY_DB_Sequence_Row is the oracle sequence in the target schema.
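The CKM behaviour described in (3) amounts to splitting the incoming flow: rows violating a declared constraint land in the E$ error table, the rest continue to the target. A minimal sketch of that split (constraint names and row fields are hypothetical, not ODI's actual implementation):

```python
def flow_control(rows, constraints):
    """Split rows the way an ODI CKM does: valid rows go on to the target,
    violations are collected in an E$-style error list, tagged with the
    names of the constraints they failed."""
    target, errors = [], []
    for row in rows:
        failed = [name for name, check in constraints.items() if not check(row)]
        if failed:
            errors.append({**row, "ERR_CONSTRAINTS": failed})
        else:
            target.append(row)
    return target, errors

# Hypothetical business rules on the target datastore:
constraints = {
    "PK_NOT_NULL": lambda r: r["RECORD_ID"] is not None,
    "FTE_POSITIVE": lambda r: r["FTE"] >= 0,
}
rows = [{"RECORD_ID": 1, "FTE": 40}, {"RECORD_ID": None, "FTE": -5}]
target, errors = flow_control(rows, constraints)
# one row reaches the target; the other carries both failed constraint names
```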
HTH -
Error while extracting data from a remote system
Hi,
I am facing a problem while extracting data from a remote system. The connection is fine and I can extract the required table from the remote system, but when I deploy it I get this error:
ORA-04052: error occurred when looking up remote object [email protected]@ORACLE_UBN_15_LOCATION1
ORA-00604: error occurred at recursive SQL level 1
ORA-28000: the account is locked
ORA-02063: preceding line from UBNDW@ORACLE_UBN_15_LOCATION1
Here Scott.demo1 is the table, UBNDW is the SID of the remote system, and ORACLE_UBN_15_LOCATION1 is the location. Please help me out with this.
Thanks

Hi,
IDocs need to be processed manually either in OLTP or in BW, depending on the failure. The error message in the monitor status will take you to either BW or OLTP, wherever the problem is. Process the IDocs; this will restart the left-over packets and finish the load.
Check the IDocs in WE05 (t-code) and their status (WE51, WE52, WE53), then go to WE19 to re-execute the existing IDoc so that it is loaded successfully.
Go to ST22 and check the short dump error message.
Post if there is any further information.
Thanks,
Shreya -
Error while loading data into clob data type.
Hi,
I have created an interface to load data from an Oracle table into another Oracle table. The target table has an attribute with the CLOB data type. While loading data into the CLOB field, ODI gave the error below. I use ODI 10.1.3.6.0.
java.lang.NumberFormatException: For input string: "4294967295"
at java.lang.NumberFormatException.forInputString(Unknown Source)
at java.lang.Integer.parseInt(Unknown Source)
at java.lang.Integer.parseInt(Unknown Source)
Let me know if anyone come across and resolved this kind of issue.
Thanks much,
Nishit Gajjar

Mr. Gajjar,
You didn't mention which KMs you are using.
Have a read of:
Re: Facing issues while using BLOB
and
Load BLOB column in Oracle to Image column in MS SQL Server
Try again.
And can you please mark the correct/helpful points on the answers too?
Edited by: actdi on Jan 10, 2012 10:45 AM -
Error while inserting data into a table.
Hi All,
I created a table. While inserting data into the table I am getting an error that says "Create data Processing Function Module". Can anyone help me with this?
Thanks in advance
Anirudh

Hi Anirudh,
It seems there is already an entry in the table with the same primary key.
An INSERT statement will give a short dump if you try to insert data with the same key.
Why don't you use a MODIFY statement to achieve the same?
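The INSERT-vs-MODIFY distinction maps onto plain SQL: a second INSERT with an existing primary key fails (the short dump Manish describes), while a MODIFY-style upsert overwrites the row. A sketch using Python's sqlite3 (table and column names hypothetical, standing in for the ABAP database table):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ztab (id INTEGER PRIMARY KEY, val TEXT)")
conn.execute("INSERT INTO ztab VALUES (1, 'first')")

# INSERT with a duplicate key is rejected -- the SQL analog of the ABAP dump
try:
    conn.execute("INSERT INTO ztab VALUES (1, 'second')")
except sqlite3.IntegrityError:
    pass  # duplicate primary key

# MODIFY-style upsert: insert when the key is new, overwrite when it exists
conn.execute("INSERT OR REPLACE INTO ztab VALUES (1, 'second')")
val = conn.execute("SELECT val FROM ztab WHERE id = 1").fetchone()[0]
# val is now 'second'
```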
Reward points if this Helps.
Manish -
Hi All.
Below is the error I am getting while extracting data.
Please let me know the reason and how to overcome it.
Status Log
Warning ORA-02049: timeout: distributed transaction waiting for lock
Error ORA-01555: snapshot too old: rollback segment number 19 with name "_SYSSMU19$" too small
ORA-02063: preceding line from RPMSP@RPMSP_LOCATION
ORA-06512: at "SAI_USER.SA_PROD_SUMM_DYF_MAP", line 11
ORA-06512: at "SAI_USER.SA_PROD_SUMM_DYF_MAP", line 1622
ORA-06512: at "SAI_USER.SA_PROD_SUMM_DYF_MAP", line 2228
ORA-06512: at "SAI_USER.SA_PROD_SUMM_DYF_MAP", line 6253
ORA-06512: at line 1
Regards,

Hi,
Even though you have deleted the data from the data target, you have not yet deleted the delta initialization information in the InfoPackage before restarting the delta process. Delete it in the BW system: double-click the InfoPackage --> Scheduler (menu option) --> 'Initialization options for source system' --> delete the record.
With rgds,
Anil Kumar Sharma .P -
Getting error while loading data into ASO cube from a flat file.
Hi All,
I am getting the error "Essbase error 1270040: Data load buffer [1] does not exist" while loading data into an ASO cube.
Does anyone have a solution?
Regards,
VM

Are you using ODI to load the data, or MaxL? If you are using an ODI interface, are you also using a load rule? Also, which versions of Essbase and ODI are you using?
Cheers
John
http://john-goodwin.blogspot.com/ -
Error while load data into Essbase using ODI
Hi ,
I'm getting the following error while loading measures into Essbase using ODI. I used the same log file, error file, and file path for all my dimensions, and that worked well, but I'm not sure why it is not working for measures... need help.
File "<string>", line 79, in ?
com.hyperion.odi.common.ODIHAppException: c:/temp/Log1.log (No such file or directory)
Thanks
Venu

Are you definitely running it against an agent where that path exists?
Have you tried using a different location and filename? Have you restarted the agent to make sure there is not a lock on the file?
Cheers
John
http://john-goodwin.blogspot.com/ -
Error while loading data into BW (BW as Target) using Data Services
Hello,
I'm trying to extract data from SQL Server 2012 and load it into BW 7.3 using Data Services. Data Services shows that the job finished successfully, but when I go into BW I see the below/attached error:
Error while accessing repository Violation of PRIMARY KEY constraint 'PK__AL_BW_RE_
Please let me know what this means and how to fix it. I'm not sure whether I have given sufficient information; please let me know if you need anything else.
Thanks
Pradeep

Hi Pradeep,
Regarding your query, please refer to the SCN threads below for the same issue:
FIM10 to BW 73- Violation of PRIMARY KEY -table AL_BW_REQUEST
Error in loading data from BOFC to BW using FIM 10.0
Thanks,
Daya