Data loading issue in Table
Hi Friends,
I am using ODI 11g.
I am doing a flat file to table mapping. There are 10 records in the flat file, but after loading the data into the Oracle table I can see that only 1 record was loaded.
I am using IKM SQL Control Append with the DISTINCT option enabled.
Can you please let me know where exactly the problem is?
Thanks,
Lony
Hi Lony,
Please let us know which other KMs you are using in your ODI interface.
Also check the flat file: does the PK column have the same value in every record, or different values?
Please check whether a header row is present in your flat file.
When you add the file as a table in the model, right-click the table, choose View Data, and confirm that you can see all 10 records at the ODI level.
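Note that the DISTINCT option alone could explain this: if the columns you actually map are identical across the source rows, DISTINCT collapses them into one. A minimal illustration (the table and column names here are made up, not from your interface):
-- 10 source rows whose only mapped column holds the same value
-- are reduced to a single row by DISTINCT.
SELECT DISTINCT emp_name
FROM   src_flat_file;   -- 10 rows in, 1 row out
Compare the queries ODI generates in Operator with and without the DISTINCT option to confirm.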
Regards,
Phanikanth
Similar Messages
-
Data Load Issue "Request is in obsolete version of DataSource"
Hello,
I am facing a very strange data load issue in production. I am able to load the data up to the PSA, but when I run the DTP to load the data into 0EMPLOYEE (a master data object), I get the message below:
Request REQU_1IGEUD6M8EZH8V65JTENZGQHD not extracted; request is in obsolete version of DataSource
The request REQU_1IGEUD6M8EZH8V65JTENZGQHD was loaded into the PSA table when the DataSource had a different structure to the current one. Incompatible changes have been made to the DataSource since then and the request cannot be extracted with the DTP anymore.
I have taken the following actions:
1. Replicated the DataSource
2. Deleted all requests from the PSA
3. Activated the DataSource using RSDS_DATASOURCE_ACTIVATE_ALL
4. Re-transported the DataSource, transformation, and DTP
Still getting the same issue
If you have any idea please reply asap.
Samit
Hi,
Generate your DataSource in R/3, then replicate it and activate the transfer rules.
Regards,
Chandu. -
I am new to Demantra. I have installed a stand-alone Demantra system on our server. In order to load data, I created a new model, defined item and location levels, then clicked 'Build Model'. The data is loaded into 3 custom tables created by me. After creating the model, I cannot log in to Collaborator Workbench; it gives the message 'There are system errors. Please contact your System Administrator'. Can anyone please tell me what I am doing wrong and how to resolve the issue.
Thanks
Ok, so if the ASO value is wrong, then it's a data load issue and there is no point messing around with the BSO app. You are loading two transactions to the exact same intersection. Make sure your data load is set to aggregate values and not overwrite.
-
I am having an issue where the data that drives a TileList works correctly only when the TileList is not on the first page of the application. When it is put on a second page in a ViewStack, the TileList displays correctly when you navigate to it. When the TileList is placed on the first page of the application, I get the correct number of items in the TileList, but the information the item renderer is supposed to display (a picture, caption, and title) does not appear. The strange thing is that a Tree populates correctly in the same situation. Here is the sequence of events:
// get_tree is the data for the tree and get_groups is the data for the tilelist
creationComplete="get_tree.send();get_groups.send();"
<mx:HTTPService showBusyCursor="true" id="get_groups"
    url="[some xml doc]" resultFormat="e4x"/>
<mx:XMLListCollection id="myXMlist"
    source="{get_groups.lastResult.groups}"/>
<mx:HTTPService showBusyCursor="true" id="get_tree"
    url="[some xml doc]" resultFormat="e4x"/>
<mx:XMLListCollection id="myTreeXMlist"
    source="{get_tree.lastResult.groups}"/>
The data providers of the TileList and Tree are then set accordingly. I tried moving the data calls from the creationComplete event to the initialize event, thinking they would fire earlier in the process and be done by the time creation completed, but that didn't help either. I am at a loss as to why the Tree works fine no matter where I put it but the TileList does not. It's almost as if the Tree and the TileList will sit and wait for the data, but the item renderer in the TileList will not. That would explain why clicking on the TileList still produces the correct sequence of events while the visual component is just not rendering right. Anyone have any ideas?
-
Data Load to Oracle Tables.
Dear User,
I have a requirement to create a new APEX application that will allow the Planning department to create item lists and location lists. They will build the lists from values (locations or items) that they paste from their spreadsheets into the APEX app.
The location list will be in the format below:
"02-05-07-14-15-19-20-21-23-25-27-28-44-47-53-54-61-63-65-66-68-69-74-75-77-79-81-85-86-87-89-90-92-95-96-97-98-99-101-103-105-107-111-112-116-119-120-127-130-133-139-143-147"
And the item list will be in the format below:
7310468
1009521
4490033
4490156
4490318
4490334
4490059
4490083
I need to load the location list data and the item list data into two different tables, "Loc_List_stage" and "Item_list_stage". Every location should end up on its own row, and the same for the item list.
Any Help or suggestion will be appreciated.
Thanks!
NS
Hi,
Go to: SQL Workshop -> Utilities -> Data Workshop
In the Data Load section choose: Text Data
Load To : Existing table
Load From: Copy and paste
Next, choose the schema containing the table you want to load into, and then select the table name.
Then copy and paste the data to be uploaded and run the upload.
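If you would rather split the hyphen-delimited location string in SQL yourself, here is a rough sketch; it assumes (hypothetically) that a page item :P1_LOC_LIST holds the pasted string and that Loc_List_stage has a single LOC_CODE column:
-- one row per hyphen-separated token in the pasted string
INSERT INTO loc_list_stage (loc_code)
SELECT REGEXP_SUBSTR(:P1_LOC_LIST, '[^-]+', 1, LEVEL)
FROM   dual
CONNECT BY LEVEL <= REGEXP_COUNT(:P1_LOC_LIST, '[^-]+');
The item list already arrives one value per line, so it can be pasted into the Data Workshop as-is.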
See if it works!
Regards,
Kiran -
BI 7.0 data load issue: InfoPackage can only load data to PSA?
BI 7.0 backend extraction gurus,
We created a generic DataSource on R/3 and replicated it to our BI system, then created an InfoSource, the transformation from the DataSource to the InfoSource, an ODS, and the transformation from the InfoSource to the ODS.
After the transformation between the InfoSource and the ODS was created on this BI system, a new folder called "Data Transfer Process" also appeared under this ODS in the InfoProvider view. In the Data Transfer Process, we pick 'Full' in the Extraction Mode field on the Extraction tab; the Execute tab has an 'Execute' button, and clicking it (note: we had not created an InfoPackage yet) appears to run the data load, but no data shows up even though every status is green (we do have a couple of records in the R/3 table).
Then we tried to create an InfoPackage. In the Processing tab, the 'Only PSA' radio button is checked and all the other options, like 'PSA and then into Data Targets (Package by Package)', are dimmed! In the Data Target tab, the ODS can't be selected as a target! There are also some new columns in this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under the column 'DTP(S) are active and load to this target' there is an inactive icon, which is weird since we have already activated the Data Transfer Process! Anyway, we started the data load from the InfoPackage, and the monitor shows the records being brought in, but since 'Only PSA' is checked in the Processing tab with everything else dimmed, no data reaches the ODS! Why can 'Only PSA' be checked with all the others dimmed in BI 7.0?
There are many new features in BI 7.0! Anyone's idea/experience on how to load data in BI 7.0 is greatly appreciated!
You don't have to select anything.
Once the data is loaded to the PSA, in the DTP you have the option of FULL or DELTA: FULL loads all the data from the PSA, and DELTA loads only the last load of the PSA.
Go through the links for lucid explanations:
Infopackage -
http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
DTP
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
Creating DTP
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
Pre-requisite:
You have used transformations to define the data flow between the source and target object.
Creating transformations-
http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
Hope it Helps
Chetan
@CP.. -
Data loading from one table to another
Hi,
I want to load some data from a temp table into a master table. The master table has 40 million records and the temp table has 23 million. The master table has around 50 columns and we are adding 4 new columns; the temp table has 5 columns. The data for these 4 new columns is available in the temp table, and the employee column is common to the two tables.
I used a stored procedure with a cursor to load the data, but it is taking more than 6 hours.
Can anyone suggest a faster technique to load the data?
Thanks,
Santhosh.
Hi, consider this scenario, which matches yours.
First of all, you have to update, not insert into, the master table.
master table = emp with columns (emp_id, emp_name, emp_designation)
to this original master table you added two more columns emp_salary, emp_department
so now your master table looks like emp_id, emp_name, emp_designation, emp_salary, emp_department
but when you do select * from master table, the last two columns salary & department are blank.
Now you have another temp table with folllowing columns (emp_id, emp_salary, emp_department)
Now emp_id is common to the master and temp tables, and you want to put values from the temp table into the master table? I think this is what you're trying to do.
So for the above case, the query I would write is:
update master_table m
set    (m.emp_salary, m.emp_department) =
       (select t.emp_salary, t.emp_department
        from   temp_table t
        where  t.emp_id = m.emp_id)
where  exists
       (select 1 from temp_table t where t.emp_id = m.emp_id);
commit;
(The where exists clause matters: without it, every master row with no match in the temp table would get its new columns set to NULL.)
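For 40 million rows, a single set-based statement is usually far faster than a cursor loop. Here is a sketch of the same update as a MERGE, using the same assumed table and column names, and assuming emp_id is unique in the temp table:
-- one pass over both tables instead of row-by-row fetches
MERGE INTO master_table m
USING temp_table t
ON (m.emp_id = t.emp_id)
WHEN MATCHED THEN
  UPDATE SET m.emp_salary     = t.emp_salary,
             m.emp_department = t.emp_department;
COMMIT;
If the system has spare capacity, you can also try ALTER SESSION ENABLE PARALLEL DML before running it.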
Regds. -
DATA LOAD ISSUE /NO ROLL UP MEMORY
Hello Team,
I have a master data load failure for FIS_BELNR. I think the problem is that every time it tries to load, it runs out of internal table space in the backend; I really don't know what that means. This load has been failing every day with the same problem.
I have attached all the screenshots, including the ABAP short dump analysis. One can read in detail why it is failing; the dump tells exactly what the problem is, but how do we fix it?
If any more details are needed, please let me know.
ABAP runtime errors TSV_TNEW_BLOCKS_NO_ROLL_MEMORY
Occurred on 25.10.2007 at 02:53:55
>> Short dump has not been completely stored. It is too big.
No roll storage space of length 2097424 available for internal storage.
What happened?
Each transaction requires some main memory space to process
application data. If the operating system cannot provide any more
space, the transaction is terminated.
What can you do?
Try to find out (e.g. by targetted data selection) whether the
transaction will run with less main memory.
If there is a temporary bottleneck, execute the transaction again.
If the error persists, ask your system administrator to check the
following profile parameters:
o ztta/roll_area (1.000.000 - 15.000.000)
Classic roll area per user and internal mode
usual amount of roll area per user and internal mode
o ztta/roll_extension (10.000.000 - 500.000.000)
Amount of memory per user in extended memory (EM)
o abap/heap_area_total (100.000.000 - 1.500.000.000)
Amount of memory (malloc) for all users of an application
server. If several background processes are running on
one server, temporary bottlenecks may occur.
Of course, the amount of memory (in bytes) must also be
available on the machine (main memory or file system swap).
Caution:
The operating system must be set up so that there is also
enough memory for each process. Usually, the maximum address
space is too small.
Ask your hardware manufacturer or your competence center
about this.
In this case, consult your hardware vendor
abap/heap_area_dia: (10.000.000 - 1.000.000.000)
Restriction of memory allocated to the heap with malloc
for each dialog process.
Parameters for background processes:
Error analysis
The internal table "IT_62" could not be enlarged further.
You attempted to create a block table of length 2097424 for the internal
table "IT_62". This happens whenever the OCCURS area of the internal table
is exceeded. The requested storage space was not available in the roll
area.
The amount of memory requested is no longer available.
How to correct the error
Please try to decide by analysis whether this request is
reasonable or whether there is a program error. You should pay
particular attention to the internal table entries listed below.
The amount of storage space (in bytes) filled at termination time was:
Roll area...................... 2595024
Extended memory (EM)........... 2001898416
Assigned memory (HEAP)......... 1886409776
Short area..................... 16639
Paging area.................... 24576
Maximum address space.......... "-1"
If the error occurred in a non-modified SAP program, you may be
able to find a solution in the SAP note system.
If you have access to the note system yourself, use the following
search criteria:
"TSV_TNEW_BLOCKS_NO_ROLL_MEMORY"
"SAPLZ_BW_EXTRACTORS " or "LZ_BW_EXTRACTORSU24 "
"Z_BW_AP_GL_BELNR"
If you cannot solve the problem yourself, please send the
following documents to SAP:
1. A hard copy print describing the problem.
To obtain this, select the "Print" function on the current screen.
Thanks
Hello,
The memory of your internal table went beyond the system-configured threshold.
Decrease your package size or extend the mentioned parameters (a Basis task):
ztta/roll_area (1.000.000 - 15.000.000)
Classic roll area per user and internal mode
usual amount of roll area per user and internal mode
ztta/roll_extension (10.000.000 - 500.000.000)
Amount of memory per user in extended memory (EM)
abap/heap_area_total (100.000.000 - 1.500.000.000)
Regards, Patrick Rieken -
Mainframe data loaded into Oracle tables - Test for low values using PL/SQL
Mainframe legacy data has been copied straight from the legacy tables into mirrored tables in Oracle. Some columns in the mainframe data had 'low values' in them. These columns were defined on the Oracle tables as VARCHAR2. Looking at the data, some of these columns appear to contain little square boxes; I am not sure, but maybe that is how Oracle renders the 'low values' from the original data in a VARCHAR2. When I run a select to find all rows where this column is not null, it returns these rows. In the results the columns appear blank, but looking at the data in SQL Developer I can see the odd 'square boxes'. My guess is that the select statement detects that something exists in the column. Long story short, I am going to have to test this legacy data in the Oracle table using PL/SQL for 'low values'. Does anyone have any suggestions on how I could do this? Help! The mainframe data we are loading into these tables is loaded with columns containing low values.
I am using Oracle 11i.
Thanks
Edited by: ncsthbell on Nov 2, 2009 8:38 AM
ncsthbell wrote:
> Mainframe legacy data has been copied straight from the legacy tables into mirrored tables in Oracle.
Not a wise thing to do. Mainframe operating systems typically use EBCDIC, while Unix and Windows servers use ASCII. The endianness is also different (big endian vs. little endian).
> Does anyone have any suggestions on how I could do this?
As suggested, use the SQL function DUMP() to see the actual contents (in hex) of these columns.
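Mainframe low-values are binary zeros, which survive the copy as CHR(0) bytes; that is what renders as the little square boxes. A rough sketch of how to find and clean them follows; the table and column names here are hypothetical:
-- inspect rows whose column contains low-values (CHR(0))
SELECT DUMP(the_col, 16) AS hex_contents
FROM   legacy_table
WHERE  INSTR(the_col, CHR(0)) > 0;
-- null out columns that consist entirely of low-values
UPDATE legacy_table
SET    the_col = NULL
WHERE  the_col IS NOT NULL
AND    REPLACE(the_col, CHR(0)) IS NULL;
-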
Error in 0EMPLOYEE Master Data Load Issue
Hi,
We have 0EMPLOYEE master data. Due to new development changes related to 0EMPLOYEE, we scheduled 2 new InfoPackages with a personnel number range. While creating the InfoPackages, we forgot to maintain the time interval 01.01.1900 to 31.12.9999; instead, the default range 24.04.2009 to 31.12.9999 was selected. Because of this selection in the InfoPackage, the Valid From date in the employee master data was changed to 24.04.2009 for all employees after the data load.
Even after I corrected this selection and reloaded the data, the Valid From dates are not being corrected.
Can you please advice, how can we fix this issue ASAP as its a production issue?
Thanks!
Best regards,
Venkata
> Even after I corrected this selection and reloaded the data, the Valid From dates are not being corrected.
Maybe for this your ONLY option is to delete the 0EMPLOYEE master data and reload it. For that you also need to delete the dependent transaction data.
Cheers,
Sree -
Hi gurus,
Presently I am working on BI 7.0, and I have a small issue regarding master data loading.
I have a generic data source for master data loading and have to fetch this data to the BW side. I always have to do a full load to the master data object. The first time, I scheduled the InfoPackage and ran the DTP to load data to the master data object: no issues, the data loaded successfully. But whenever I run the InfoPackage a second time and run the DTP, I get an error about duplicate records.
How can i handle this.
Best Regards
Prasad
Hi Prasad,
Following is happening in your case:
Loading 1st time:
1. Data is loaded to the PSA through the InfoPackage. It is a full load.
2. Data is loaded to the InfoObject through the DTP.
Loading 2nd time:
1. Data is again loaded to the PSA. It is a full load.
2. At this point, the data in the PSA itself is duplicated. So when you run the DTP, it picks up the data of both requests that were loaded to the PSA, and hence you get the duplicate record error.
Please clear the PSA after the data is loaded to infoobject.
Assign points if helpful.
Regards,
Tej Trivedi -
2LIS_11_VAHDR Data Load Issue
Hello Gurus,
I have a cube which sources data from InfoSource 2LIS_11_VAHDR. ( It also sources from 2LIS_11_VAITM. )
The DataSource, InfoPackage, update rules etc. are all in place.
(a) In Monitor:
Whenever I try to load data, in RSMO I get a yellow triangle (0 from 0 records).
In the Detail tab, I see:
--- Extraction (messages): Errors occurred (yellow triangle)
Data request received (green square)
Data selection scheduled (green square)
No data available, data selection ended (yellow triangle)
(b) In InfoPackage:
In the Update tab the selection is "full update".
(c) I tried:
I replicated the DataSources, then activated the update rules and the InfoSource. I ran the data load again and checked RSMO (the monitor); I still see 0 from 0 records.
How do I go about resolving this?
Would appreciate your help.
Thanks for your time.
Fellow Developer..
Pramod.
Hi,
To check the data in the setup tables, go to RSA3, give the data source name, and click on the extractor.
Note: when using RSA3, find out the value of "Update Mode" before executing. If the Update Mode is F or C, it will look at the data from the setup tables. If it is D or R, it will check the data in the delta queue. Use the drop-down menu on the parameter to see the different values for "Update Mode".
Option 2:
For the datasource 2LIS_11_VAHDR, take the extract structure name, which you can find in LBWE: select your application area, click on extract structures, and pick the extract structure for your datasource.
For example, for the SD Sales BW application area (11), the extract structure is MC11VA0HDR.
Then go to SE11 and enter the extract structure name + "SETUP", i.e. MC11VA0HDRSETUP.
There you can see the data.
Cheers,
Swapna.G -
Data Loading issues in PSA and Infocube
Hi team,
I am loading data into the PSA and from there into an InfoCube via a DTP.
When I load data into the PSA, all 6000 records are transferred and the process completes successfully.
When I execute the DTP to load the data into the InfoCube, the load also completes successfully, but in the Manage tab I see:
"Transferred 6000 || Added Records 50"
I am not able to understand why only 50 records were loaded into the InfoCube. If some records were rejected, where can I find them and the reason for the rejection?
Kindly assist me in understanding this issue.
Regards
bs
Hi,
The records would have been aggregated based on common values.
If in the source you had:
custname  xnumber  kf1
R         56       100
R         53       200
S         54       200
then after aggregation you will have:
custname  kf1
R         300
S         200
because xnumber is not in your cube.
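In SQL terms the effect is the same as a GROUP BY over the characteristics that remain in the cube (illustrative only; the table name is made up):
-- rows with the same custname collapse, and key figures are summed
SELECT custname, SUM(kf1) AS kf1
FROM   src_data
GROUP  BY custname;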
Hope it is clear now.
Regards,
Rathy -
Hi all,
I have an issue with a DSO data load. The data arrives in the PSA perfectly (238 records), but when I trigger the DTP, only 6 records come through.
Can anyone suggest something, please?
Thanks,
Gayatri.
Hi Gayatri,
If you have already loaded some data to the DSO and are now trying to do a delta, it is possible that it is picking up only the delta data.
(or)
You may have start/end routines or rule routines written to delete records based on some conditions.
(or)
It also depends on the key field you selected in the DSO. If the key field you selected has repeated values, the data is aggregated while loading into the DSO: if you have 10 rows for a key field value of, say, 101, the DSO will hold only one row with value 101 (10 rows becoming 1 row), with the key figure either summed or overwritten depending on what you selected in the rule details for the key figure. (You can check this by right-clicking the key figure mapping > Rule Details; there you can see whether it is Overwrite or Summation.)
Also, as mentioned in the posts above, you can go to the DSO > Manage and check the number of rows transferred and the number of rows added.
Hope it is clear & helpful!
Regards,
Pavan -
I am doing a test in BI 7.0, DataSource to PSA.
The overall status and technical status turn green, but when I go to the detail, I see a warning:
requests (messages): Request confirmed
Missing message: Request confirmed
So I went to the source system, ran SM37, and searched for the jobs BI*; they all finished successfully. Back on the BW side I refreshed the monitor, and the data load was still pending.
If I update the PSA to the data target via the transfer rule / update rule, it fails due to a timeout.
If I debug the transfer rule / update rule, everything is fine.
If I simulate the data upload, it is fine.
I have checked RSRV; everything is fine.
Any hint / clue on this strange case?
P.S.: In the BW monitor I actually see the information below (to PSA). It is really weird, because the overall status was already set to green by the system, yet it is still running!
Request still running
Diagnosis
No errors found. The current process has probably not finished yet.
System Response
The ALE inbox of BI is identical to the ALE outbox of the source system
or
the maximum wait time for this request has not yet been exceeded
or
the background job has not yet finished in the source system.
Current status
Data request was not successfully sent off
Hi,
Follow these steps if the loads have been running for so many days:
1. Check the previous requests (at the data target level), so that you get an idea how long the load usually takes in BI.
If the load is full:
Check the data at the RSA3 level. If data is available in R/3, try to load the data into the PSA first. If the data is not there, check the data at the table level.
If it is delta:
Check the delta queue as well as the R/3 background jobs for the particular DataSource.
Thanks,