Need partial data load in PSA
Hi All
While loading employee data from a DataSource (file source system) to the PSA, I got an error in one record, so the PSA was not loaded at all. What I want is for it to load all the other valid records, ignoring the bad record.
Please help me solve this small issue.
Thanks in Advance
Harpal
Edited by: Harpal Singh on Nov 12, 2009 11:41 PM
Hi, my friend,
If you cannot see the selection for the particular field in the data package, you need to make the field selectable (i.e. check it in R/3), save the DataSource, and replicate the DataSource in the BW system.
Now go to the InfoPackage ---> Data Selection ---> and enter your filter value for the field there.
(You can also write an ABAP routine at the InfoPackage level: next to the From value and To value, select type 6 (routine) and the system will prompt you to write a routine at the InfoPackage level.) This is optional.
Apart from this, you can filter in the data selection (From and To options in the InfoPackage).
Select it and load the data.
Santosh
Edited by: Santhosh Nagaraj on Nov 13, 2009 10:24 PM
Similar Messages
-
Multiple data loads in PSA with write optimized DSO objects
Dear all,
Could someone tell me how to deal with this situation?
We are using write-optimized DSO objects in our staging area. These DSOs are filled with full loads from a BOB SAP environment.
The content of these DSO objects is deleted before loading, but we would like to keep the data in the PSA for error tracking and solving. This also provides the opportunity to see the differences between two data loads.
For normal operation, the most recent package in the PSA should be loaded into these DSO objects (as in normal data staging in BW 3.5 and before).
As far as we can see, it is not possible to load only the most recent data into the staging layer. This causes duplicate record errors when there are multiple data loads in the PSA.
We already tried the "all new records" functionality in the DTP, but that only loads the oldest data package and does not process the new PSA loads.
Does any of you have a solution for this?
Thanks in advance.
Harald
Hi Ajax,
I did think about this, but it is more of a workaround. Call me naive, but it should work as it did in BW 3.5!
The proposed solution will require a lot of maintenance afterwards. Besides that, you also get a problem with PSA IDs after they have been changed: if you use the option to delete the content of a PSA table via a process chain, it will fail when the DataSource is changed, due to a newly generated PSA table ID.
Regards,
Harald -
Data loading through PSA in 3.5
Hi experts,
I have always worked with 7.0, so when I create a data load chain for an InfoCube, I first delete the index, then execute the InfoPackage (with "only to PSA"), execute the DTP, and create the index again.
Now I am working on a 3.5 system and the DTP does not exist. How should I proceed to transfer the data after executing the InfoPackage? I prefer doing it through the PSA if possible.
Thank-you very much,
Artur.
Hi Artur,
The difference between 3.5 and 7.0 is:
- In 7.0, the InfoPackage brings the data as far as the PSA, and then you have to execute a DTP to bring the data from the PSA to the data target. In 3.5, the InfoPackage brings the data directly to the data target, with or without PSA, depending on the update type you select on the Processing tab of the InfoPackage.
In 3.5 you have to follow the same steps: delete the index, load the data through the InfoPackage (with "PSA and data target (package by package)"), and create the index.
In the 7.0 version, SAP broke this one process into two parts, one up to the PSA (through the InfoPackage) and the other from the PSA to the data target (DTP), for better control and an improved ETL process.
Regards,
Kams -
Assistance Needed on Data Load Monitoring
Presently I'm working on a support project doing data load monitoring (mainly process chains). As a result, I will be the person responsible for fixing some of the loads when they fail, e.g. duplicate records, erroneous records, attribute change runs, etc.
As a fresher in BW I need to learn things like types of updates, process types, and process chains proficiently, and fast. I request you to please send some PDFs or PPTs regarding these; if there are any books, then please suggest one.
I mainly need:
- Data flow and update types, briefly
- Process types and chains
- Delta mechanism
Hope you will bail me out of this situation.
Thanks in Advance ,
HP
Here are all the links:
sap document on process chains:
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/8da0cd90-0201-0010-2d9a-abab69f10045
help.sap.com link for creation of process chains:
http://help.sap.com/saphelp_nw04/helpdata/en/67/13843b74f7be0fe10000000a114084/frameset.htm
helpful threads on issues related to process chains:
Process Chains creation and monitoring
Process Chain
Re: OPEN HUB with Process Chain
https://forums.sdn.sap.com/click.jspa?searchID=6646868&messageID=3806122
Re: attribute change run
https://forums.sdn.sap.com/click.jspa?searchID=6646868&messageID=3790051
Re: Process Chain Scheduling
https://forums.sdn.sap.com/click.jspa?searchID=6646868&messageID=4426820
Re: Process Chain
https://forums.sdn.sap.com/click.jspa?searchID=6646868&messageID=3692343
Re: Process Chain
https://forums.sdn.sap.com/click.jspa?searchID=6646868&messageID=3841608
Re: Can a process can have more than one variant?
https://forums.sdn.sap.com/click.jspa?searchID=6646868&messageID=3855067
Re: about the open hub...!
https://forums.sdn.sap.com/click.jspa?searchID=6646868&messageID=3700389
Re: Process chain statistics
https://forums.sdn.sap.com/click.jspa?searchID=6646868&messageID=4162319
Re: Process Types in BI 7.0
https://forums.sdn.sap.com/click.jspa?searchID=6646868&messageID=3759916
Re: How to deactivate the local chain in Main chain
https://forums.sdn.sap.com/click.jspa?searchID=6646868&messageID=3591699
Re: changing a process type during running of process chain
https://forums.sdn.sap.com/click.jspa?searchID=6646868&messageID=3750639
Re: Node for Process chain--Urgent
https://forums.sdn.sap.com/click.jspa?searchID=6646868&messageID=3692720
Re: Process Chains Events
https://forums.sdn.sap.com/click.jspa?searchID=6646868&messageID=3707415
Re: process chain event trigger problem
https://forums.sdn.sap.com/click.jspa?searchID=6646868&messageID=4374184
Thanks,
Shambhu -
Partial data loading from dso to cube
Dear All,
I am loading the data from a DSO to a cube through a DTP. The problem is that some records (around 50 crore) are getting added to the cube through data packet 1, while through data packet 2 records are not getting added.
It is a full load from the DSO to the cube.
I have tried deleting the request and executing the DTP again, but the same number of records gets added to the cube through data packet 1, and after that no more records are added; the request remains in yellow state only.
Please suggest.
Nidhuk,
Data loads transfer package by package. Your story sounds like it got stuck in the second package or something. I suggest you check the package size and try increasing it to a higher number to see if anything changes. 50 records per package is kind of low; your load should not spread out into too many packages.
Regards. Jen
Edited by: Jen Yakimoto on Oct 8, 2010 1:45 AM -
Need faster data loading (using SQL*Loader)
I am trying to load approx. 230 million records (around 60 bytes per record) from a flat file into a single table. I have tried SQL*Loader (conventional load) and I'm seeing performance degrade as the file is processed. I am avoiding direct-path SQL*Loader because I need to maintain uniqueness using my primary key index during the load, so the degradation of load performance doesn't shock me. The source data file contains duplicate records and may contain records that duplicate those already in the table (I am appending during SQL*Loader).
My other option is to unload the entire table to a flat file, concatenate the new file onto it, run it through a unique sort, and then direct-path load it.
Has anyone had a similar experience? Any cool solutions available that are quick?
thanks,
jeff
It would be faster to load into an Oracle table, call it a temporary table, and then make a final move into the final table.
This way you could direct-load into an Oracle table, and then:

INSERT /*+ APPEND */ INTO Final_Table
SELECT DISTINCT *
FROM Temp_Table
ORDER BY ID;

This would do a 'direct load'-type move from your temp table to the final table, automatically merging the duplicate records.
So
1) Direct load from SQL*Loader into the temp table.
2) Place a non-unique index on temp table column ID.
3) Direct-load INSERT into the final table.
Step 2 may make this process faster or slower; only testing will tell.
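The three-step staging pattern above can be sketched end to end. Here is a minimal illustration using SQLite from Python; the table and column names are illustrative stand-ins for the Oracle objects, and SQLite has no direct-path hint, so only the DISTINCT move into the final table is modeled:

```python
import sqlite3

# In-memory database standing in for the Oracle schema; names are illustrative.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE temp_table (id INTEGER, payload TEXT)")
con.execute("CREATE TABLE final_table (id INTEGER PRIMARY KEY, payload TEXT)")

# Step 1: the raw load lands in the temp table, duplicates and all.
rows = [(1, "a"), (2, "b"), (2, "b"), (3, "c"), (3, "c"), (3, "c")]
con.executemany("INSERT INTO temp_table VALUES (?, ?)", rows)

# Step 3: move only distinct rows into the final table, so the primary
# key on final_table.id is never violated.
con.execute(
    "INSERT INTO final_table "
    "SELECT DISTINCT id, payload FROM temp_table ORDER BY id"
)

count = con.execute("SELECT COUNT(*) FROM final_table").fetchone()[0]
print(count)  # 3 distinct records survive out of the 6 staged rows
```

The point of the pattern is that uniqueness is enforced in one set-based pass over the staging table rather than row by row during the load, which is what makes the direct-path load usable in the first place.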
Good Luck,
Eric Kamradt -
Data load to PSA cuts off the last digit
Hi
For a 3.x DataSource I have a PSA with a DEC key figure.
When I try to load the value -118964643632,91 into the PSA, it is stored as -118964643632,9, so I lose the final 1.
Is it possible to do something about this error?
Hi there,
The number of decimal places in your key figure's definition is valid only for display purposes in the Business Explorer environment.
Check in the dictionary (SE11) how your InfoObject is defined. Normally, it should be defined as data type DEC with length 17 and 3 decimal places if the data type of your InfoObject is number.
You can find your InfoObject if you enter, in the dictionary's data type field, the name /BIC/OIAAAAA, where AAAAA is the technical name of your InfoObject.
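A lost trailing digit like this is the classic symptom of precision-limited decimal storage. A small Python sketch reproduces the effect; the precision value below is chosen only to reproduce the symptom, not taken from the actual field definition, which has to be checked in SE11 as described above:

```python
from decimal import Decimal, getcontext

# The PSA stored -118964643632,91 as -118964643632,9: one significant
# digit was dropped. A 13-significant-digit context shows the same loss.
getcontext().prec = 13                 # illustrative, not the real DEC length
value = Decimal("-118964643632.91")    # 14 significant digits
stored = +value                        # unary plus applies the context rounding
print(stored)  # -118964643632.91 rounds to -118964643632.9
```

If the dictionary definition really allows 17 digits, the value above fits, so the truncation would have to come from a narrower definition somewhere in the staging path.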
Regards,
Theodoros -
Issue when doing delta loads from PSA to DSO using DTP - please help, urgent
Hi All,
I have done 3 data loads into the PSA, and from the PSA I am loading the data into a DSO by splitting that load into 3 using 3 DTPs (split by fiscal period).
Two of the DTP loads extract the data from all 3 PSA requests.
But one of the DTP loads filters on only ONE PSA request.
So we are not getting the full data into the DSO.
Can someone please help me understand why this DTP load is behaving like this?
Even though I have not set any filters on request ID, for one load it is picking up the data by filtering on one request ID.
Cheers,
Nisha
Hi Jr,
Sorry for late reply.
I think I found the solution: the difference between the DTPs is that I had ticked "Get request one after another".
I have changed it now and it is working fine.
Thanks,
Nisha -
Hi gurus,
Presently I am working on BI 7.0. I have a small issue regarding master data loading.
I have a generic DataSource for master data loading, and I have to fetch this data to the BW side. I always have to do a full data load to the master data object. The first time, I scheduled the InfoPackage and ran the DTP to load data to the master data object; no issues, the data loaded successfully. But whenever I run the InfoPackage a second time and run the DTP, I get an error about duplicated records.
How can i handle this.
Best Regards
Prasad
Hi Prasad,
Following is happening in your case:
Loading 1st time:
1. Data is loaded to the PSA through the InfoPackage. It is a full load.
2. Data is loaded to the InfoObject through the DTP.
Loading 2nd time:
1. Data is again loaded to the PSA. It is a full load.
2. At this point, the data in the PSA itself is duplicated. So when you run the DTP, it picks up the data of both requests that were loaded to the PSA, and hence you get the duplicate record error.
Please clear the PSA after the data is loaded to infoobject.
Assign points if helpful.
Regards,
Tej Trivedi -
Any table for the data in ODS, PSA, INFOCUBE
Hi All,
We are loading data to the PSA, then to an ODS, and then to an InfoCube. Is there any table where we can check the number of records loaded in each run into the PSA, the ODS, and the InfoCube?
When you check the data in the PSA you will see (for example):
Processing PSA
Update Mode Full update
Records Check 32 Records Received
Since it shows the number of records loaded to the PSA, there must be a table where the data is stored.
Does anyone have an idea which table this is?
Thanks,
Gautam
Hi,
You can check the data in the PSA table and in the active table of the ODS.
PB -
Met issue in generic master data load
Hi Experts,
I met an issue showing 'System error: RSDU_ANALYZE_TABLE_ORA/ ORACLE' when updating data from the PSA to an InfoObject.
the scenario is:
1. I created a generic master DataSource on the VBAK table in R/3, extracting only four fields to BW: AEDAT, VBELN, VKBUR, VKORG.
2. On the BW side, I created one InfoObject s_saled (sales document), with the other three characteristics as its attributes.
3. I then created a direct InfoSource to load data from R/3 into the InfoObject s_saled.
4. InfoPackage to load the data.
The data loads into the PSA successfully, but hits the issue at the update step in the monitor. The error message is 'System error: RSDU_ANALYZE_TABLE_ORA/ ORACLE'. It says that 6279 records are new but 0 changed.
But when I checked the InfoObject, I found that the data is already loaded there.
can anyone give me some advice on this?
thanks in advance!
Eileen
Hi Eileen,
If you face any error, you can simulate it in the monitor, in the Details tab.
As you say the master data is already loaded, the purpose of your action is done.
And you didn't say whether this error persists in subsequent loads.
Thank you. -
Not all data is loaded from the PSA
Hi,
I have loaded 0MATERIAL_ATTR data to the master data object 0MATERIAL.
I got whatever records were available in R/3 into the PSA after the InfoPackage execution.
After executing the DTP, I didn't get all fields of the PSA data into 0MATERIAL.
For the fields I mapped in the transformation, the data is available in the PSA but not in 0MATERIAL.
I also restricted the fields in the InfoPackage, but the DTP is not extracting only the PSA fields; it is extracting a lot of fields. How can I restrict this in the DTP?
And if I want delta records next time, what do I have to do?
Urgent reply please.
Thanks in advance.
a. Check your transformation; make sure the key field is set correctly.
b. No need to restrict in the DTP unless you need a language or something.
c. Delta records will come through fine if your DTP has delta on it.
Works for us.
cheers,
Sarah -
Data loaded from a non-existent PSA
Hi all,
I have this problem:
I am loading an infocube from a PSA with a DTP.
The problem is that data in the InfoCube is duplicated when reading data from that PSA.
In addition, if all PSA requests are deleted, the data arrives in the InfoCube correctly, without duplicating records.
It works as if there were a hidden PSA associated with the DataSource.
Does anybody know how to delete PSA data? (Not requests, as I have deleted all of them.)
Program RSAR_PSA_CLEANUP_DIRECTORY does not work...
Thanks all, but there are no PSA requests. I have deleted all of them since 1900, manually and via process chain, and we need a full upload, not a delta one.
The problem is that even though there are no PSA requests, the data is stored somewhere, and the DTP is finding it and loading it into the InfoCube.
We have found the PSA BIC/B00001930000 and would like to try deleting its data (not requests, as they have all already been deleted) -
Need help with loading master data from R/3 to BI 7.0.
Hi,
First, I thank everybody who took the effort to answer my posting; I really am learning this new version with your help. I really appreciate it.
Could anyone help me with a step-by-step process to load master data from R/3 to BI 7.0? Please don't just send help.sap.com.
will assign points .
With Thanks,
Ranjani R
Hi,
Thanks for the answers. I tried loading it yesterday and had a lot of confusion.
What should I do to load master data from R/3 to BI 7.0?
1. Created an InfoObject named EKKO_mas with some attributes (ERNAM, EKORG, LIFNR).
2. Went to InfoProvider, right-clicked the InfoArea, and selected 'Insert Characteristic as Data Target'. (Please correct me if I am wrong.)
3. Logged in to R/3, went to SBIW, created a generic DataSource with view/table EKKO, mapped the application component, and selected the needed fields by ticking the checkboxes. (Please correct me if wrong.)
4. Went to the Source System tab and replicated the DataSource. (Please correct me if I am wrong.)
Then what should I do?
Guessed steps:
4. Create a DataSource in BI 7.0. Since I am not using a flat file, should I select "application server" instead of "local workstation"? In the Fields tab I specified all the fields given in the InfoObject. (Is there anything else I should do or change in this tab?)
5. Load data into the PSA. (Will the data I selected from R/3 be loaded into the PSA without any problem?)
6. Create the transformation, save, and activate.
7. Create the DTP, save and activate, and click Execute.
By doing the above steps, will I be able to load the master data from R/3 to the BI 7.0 data target (InfoObject)? Please help me if something is wrong with my steps.
Thanks.
RR.
*will assign points. -
The data has been updated/loaded only up to the PSA
Hi,
The data has been updated/loaded only up to the PSA, even though the processing type is set to "update into PSA and then into data targets". Please let us know what the problem could be and what the solution is.
Regards,
RaghavHi Kolli,
Do you have the further update from PSA process in your process chain? You will need this if the InfoPackage has the 3rd option on the Processing tab (PSA and then to data targets).
Hope this helps...