Data loading through PSA in 3.5

Hi experts,
I have always worked with 7.0, so when I create a data load chain for an InfoCube, I first delete the indexes, then execute the InfoPackage (with "Only PSA"), execute the DTP, and create the indexes again.
Now I am working on a 3.5 system and DTPs do not exist :S. How should I proceed to transfer the data after executing the InfoPackage? I would prefer to do it through the PSA if possible.
Thank you very much,
Artur.

Hi Artur,
The difference between 3.5 and 7.0 is:
- In 7.0 the InfoPackage brings the data only as far as the PSA, and you then have to execute a DTP to bring the data from the PSA to the data target. In 3.5 the InfoPackage brings the data directly into the data target, with or without the PSA, depending on the update type you select on the Processing tab of the InfoPackage.
In 3.5 you follow the same steps: delete the indexes, load the data through the InfoPackage (with "PSA and data target (package by package)") and create the indexes again.
In the 7.0 version SAP split this one process into two parts, one up to the PSA (the InfoPackage) and one from the PSA to the data target (the DTP), for better control and an improved ETL process.
Regards,
Kams
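
If you ever want to trigger that 3.5 InfoPackage load from a custom ABAP program step in the chain instead of the standard InfoPackage step, a minimal sketch could look like the one below. The InfoPackage ID is hypothetical and the parameter names are quoted from memory, so please verify BAPI_IPAK_START in SE37 before relying on it.

REPORT z_start_infopackage.

DATA: lv_requestid TYPE char30,     " request number returned by the BAPI
      ls_return    TYPE bapiret2.   " standard BAPI message structure

" 'ZPAK_EXAMPLE' is a hypothetical InfoPackage ID - replace it with your own.
CALL FUNCTION 'BAPI_IPAK_START'
  EXPORTING
    infopackage = 'ZPAK_EXAMPLE'
  IMPORTING
    requestid   = lv_requestid
    return      = ls_return.

IF ls_return-type CA 'EA'.
  WRITE: / 'InfoPackage start failed:', ls_return-message.
ELSE.
  " In 3.5 this single request loads PSA and data target in one step.
  WRITE: / 'Request', lv_requestid, 'started.'.
ENDIF.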

Similar Messages

  • Multiple data loads in PSA with write optimized DSO objects

    Dear all,
    Could someone tell me how to deal with this situation?
    We are using write-optimized DSO objects in our staging area. These DSOs are filled with full loads from a BOB SAP environment.
    The content of these DSO objects is deleted before loading, but we would like to keep the data in the PSA for error tracking and solving. This also gives us the opportunity to see the differences between two data loads.
    For normal operation, the most recent package in the PSA should be loaded into these DSO objects (as in normal data staging in BW 3.5 and before).
    As far as we can see, it is not possible to load only the most recent data into the staging layer. This causes duplicate record errors when there are several data loads in the PSA.
    We already tried the "all new records" functionality in the DTP, but that only loads the oldest data package and does not process the new PSA loads.
    Does any of you have a solution for this?
    Thanks in advance.
    Harald

    Hi Ajax,
    I did think about this, but it is more of a workaround. Call me naive, but it should work as it did in BW 3.5!
    The proposed solution will require a lot of maintenance afterwards. Besides that, you also get a problem with the PSA table IDs after they have changed: if you use the option to delete the content of a PSA table via the process chain, it will fail when the DataSource is changed, because a new PSA table ID is generated.
    Regards,
    Harald

  • Data load through DTP giving Error while calling up FM RSDRI_INFOPROV_READ

    Hi All
    We are trying to load data into a cube through a DTP from a DSO. In the transformation we look up InfoCube data through the SAP standard function module 'RSDRI_INFOPROV_READ'. The problem we are facing is that our loads are failing with the error 'Unknown error in SQL Interface' and a parallel process error.
    In the DTP we have changed the number of parallel processes from 3 (the default) to 1, but the issue with the data loads still exists.
    We had a similar flow developed in 3.5 (the BW 3.5 way) where we used the same function module 'RSDRI_INFOPROV_READ', and there our data loads run fine.
    We suspect a compatibility issue between this FM and BI 7.0 data flows, but we are not sure. If anybody has any relevant input on this, or has used this FM in a BI 7.0 flow, please let me know.
    Thanks in advance.
    Kind Regards
    Swapnil

    Hello Swapnil.
    Please check note 979660, which mentions this issue.
    Thanks,
    Walter Oliveira.
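
    For reference, here is a minimal, hedged sketch of a packaged RSDRI_INFOPROV_READ call as it is typically written in a lookup. The InfoProvider name, the characteristic/key figure names and the result structure are illustrative assumptions, not taken from the flow above.

    TYPE-POOLS: rsdri.

    DATA: lt_sfc   TYPE rsdri_th_sfc,   " characteristics to return
          ls_sfc   TYPE rsdri_s_sfc,
          lt_sfk   TYPE rsdri_th_sfk,   " key figures to return
          ls_sfk   TYPE rsdri_s_sfk,
          lt_range TYPE rsdri_t_range,  " selections (left empty here)
          lv_first TYPE c LENGTH 1 VALUE 'X',
          lv_eod   TYPE c LENGTH 1.

    " Illustrative result structure: field names must match the aliases below.
    TYPES: BEGIN OF ty_data,
             material TYPE /bi0/oimaterial,
             quantity TYPE p LENGTH 8 DECIMALS 3,
           END OF ty_data.
    DATA lt_data TYPE STANDARD TABLE OF ty_data.

    ls_sfc-chanm    = '0MATERIAL'.
    ls_sfc-chaalias = 'MATERIAL'.
    INSERT ls_sfc INTO TABLE lt_sfc.

    ls_sfk-kyfnm    = '0QUANTITY'.
    ls_sfk-kyfalias = 'QUANTITY'.
    ls_sfk-aggr     = 'SUM'.
    INSERT ls_sfk INTO TABLE lt_sfk.

    DO.
      CALL FUNCTION 'RSDRI_INFOPROV_READ'
        EXPORTING
          i_infoprov    = 'ZSALES01'     " hypothetical InfoCube
          i_th_sfc      = lt_sfc
          i_th_sfk      = lt_sfk
          i_t_range     = lt_range
          i_packagesize = 50000
        IMPORTING
          e_t_data      = lt_data
          e_end_of_data = lv_eod
        CHANGING
          c_first_call  = lv_first
        EXCEPTIONS
          OTHERS        = 8.
      IF sy-subrc <> 0.
        EXIT.                            " handle the error here
      ENDIF.
      " ... buffer or process lt_data here ...
      IF lv_eod = 'X'.
        EXIT.
      ENDIF.
    ENDDO.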

  • How to stop the data loads through process chains

    hi,
    I want to stop all the data loads to BI through process chains where the loads happen periodically.
    Kindly suggest how I can proceed.

    Hi,
    Go to RSPC, find your process chain and double-click on START, then change the timing, i.e. set the start date to 01.01.9999. Save and activate the PC; it won't start until 01.01.9999.
    Thanks
    Reddy

  • Log on data load through a BW data flow

    Dears,
    I am asking all of you who have already implemented this type of functionality. I am trying to find the easiest way, with the least complexity, to implement a log through an existing BW data flow.
    I mean: a data load via an InfoPackage gives some log of right and wrong records in the monitor. How can I use this information? Is there a specific table which stores each record and its message, or does a program have to be implemented which publishes the loading status in a specific table?
    Thanks for your quick feedback,
    LL

    Hi Ludovic
    The monitor messages are only written if there is some problem in the record processing. You can only find information for those records which have a problem, or where the processing in the routines encountered some problem.
    What you can do to capture messages is write a transfer routine and read the monitor messages table RSMONMESS there.
    Also, please check the tables starting with RSMO*; a trivial example follows below.
    regards
    Vishal
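
    As a starting point, a hedged sketch of simply listing entries from RSMONMESS is shown below; no specific column names are used on purpose, since they should be confirmed in SE11 first.

    DATA lt_msg TYPE STANDARD TABLE OF rsmonmess.

    " Pull a sample of monitor message records; add a WHERE clause on the
    " request number column once its name has been confirmed in SE11.
    SELECT * FROM rsmonmess
      INTO TABLE lt_msg
      UP TO 100 ROWS.

    " ... evaluate lt_msg or copy the relevant entries to your own log table ...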

  • Suggest good strategy for data load through standard datasources

    Hi BW Gurus,
    We are currently using the standard purchasing-related DataSources. We foresee new reports coming in later based on the standard DataSources.
    Can you please suggest a good general strategy to follow to bring in the R/3 data? Our concerns are about the data loads (initializations etc.), as some of the standard DataSources are already in production.
    Please advise.

    Hi
    Go through these weblogs from Roberto Negro; they may help you.
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    Regards,
    Rajesh.

  • Problem in data loading through UD Connect.

    Hello All,
    I have a problem with data loading. I have created the UD Connect source system and it is working properly; then I created the UD Connect InfoSources and DataSources. But when I try to create InfoPackages, I get the following errors:
    "Error occurred in the source system" and "no record found".
    Thanks
    Shivanjali

    Hello Shivanjali,
    UDC is mostly used with a RemoteCube to access data outside of SAP. What is your external system? Make sure that there is data in the source system.
    Please check following links
    [Transferring Data with UD Connect|http://help.sap.com/saphelp_nw04s/helpdata/en/43/e35b3315bb2d57e10000000a422035/frameset.htm]
    [SAP BW Universal Data Integration SAP NW Know-How Call|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f094ba90-0201-0010-fe81-e015248bc5dd]
    [SAP BW and ETL A Comprehensive Guide|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/11e1b990-0201-0010-bf9a-bf9d0ca791b0]
    Thanks
    Chandran

  • Flat file data load through appln server # gets generated at each line

    Hi all,
    I am loading data from a flat file which is placed on the application server.
    At the end of each record I can see a # being generated.
    While loading the data in BW 3.5 I am getting an error message. Can anyone please let me know how to handle this issue with the application server?
    Thanks
    Pooja

    Hi Pooja,
    I faced a similar kind of issue in an earlier project.
    The issue is the way the file is uploaded to the application server. Again, I am not talking about the format of the file, but the way the file is uploaded. Normally there are two upload modes, binary and ASCII. If the file format is CSV and the application server is Windows, then ASCII mode is recommended. If it is uploaded in any other mode, the newline is not identified and is converted to #, which results in an error while loading the data.
    I had to put in a lot of time to find the correct combination, and then train the users to upload it the right way. My users used to upload the data using some FTP Pro software.
    Even after the training I had to check the file manually through AL11 every time before the load, but after some time the users got trained and the problem was solved.
    Hope this helps.
    Regards
    Raj
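
    To illustrate Raj's point, here is a minimal sketch (the file path is hypothetical) of reading such a file from the application server in ABAP and stripping the carriage return that shows up as '#' when a Windows CRLF file is read in text mode:

    DATA: lv_file TYPE string VALUE '/usr/sap/trans/data/upload.csv',  " hypothetical path
          lv_line TYPE string,
          lv_cr   TYPE c LENGTH 1.

    " cr_lf is a two-character constant (CR + LF); assigning it to a one-character
    " field keeps only the carriage return, which is the '#' seen in the monitor.
    lv_cr = cl_abap_char_utilities=>cr_lf.

    OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc <> 0.
      MESSAGE 'File could not be opened' TYPE 'E'.
    ENDIF.

    DO.
      READ DATASET lv_file INTO lv_line.
      IF sy-subrc <> 0.
        EXIT.                             " end of file
      ENDIF.
      " In text mode only the LF ends a line, so a CRLF file leaves the CR behind.
      REPLACE ALL OCCURRENCES OF lv_cr IN lv_line WITH ''.
      " ... split lv_line at ',' and append it to the internal table for the load ...
    ENDDO.

    CLOSE DATASET lv_file.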

  • Data Extract and Data Load Through EPMA

    Hi All,
    Hope you all are doing well,
    Can we extract data from a Classic application and load data into another Classic application through EPMA? Is this described in any Oracle document?
    If yes, then please help me, it's urgent.
    Thanks,
    Avneet
    Edited by: Avneet on Mar 16, 2011 1:31 AM
    Edited by: Avneet on Mar 16, 2011 2:29 AM

    Hi John,
    I have tried a lot using ODI, and I have raised some forum threads as well, but I am not able to do it through ODI. In ODI I am using a report script and getting some errors:
    Re: How to extract data From Hyperion Essbase to Flat Files
    Thanks,
    Avneet

  • Needs Partial data load in PSA

    Hi All
    While loading employee data from a DataSource (file source system) to the PSA, I got an error in one record, so the PSA was not loaded at all. What I want is for it to load all the other valid records, ignoring the bad record.
    Please help me in solving this small issue.
    Thanks in Advance
    Harpal
    Edited by: Harpal Singh on Nov 12, 2009 11:41 PM

    Hi, my friend,
    If you cannot see the selection for the particular field in the data package, you need to mark the field as a selection field (I mean check it in R/3), save the DataSource, and replicate the DataSource in the BW system.
    Now go to the InfoPackage ---> Data Selection ---> and enter your filter value for the field there.
    (You can also write an ABAP routine at the InfoPackage level: next to the from-value and to-value, select variable type 6 and the system will prompt you to write a routine at InfoPackage level. This is optional; see the sketch below.)
    Apart from this, you can filter in the data selection (the from and to options in the InfoPackage).
    Select it and load the data.
    Santosh
    Edited by: Santhosh Nagaraj on Nov 13, 2009 10:24 PM
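
    For the optional type-6 ABAP routine mentioned above, the InfoPackage generates a FORM frame roughly like the sketch below. The field name is illustrative and the structure name RSSDLRANGE is quoted from memory, so compare it with the frame your own system generates before copying anything.

    " Generated frame for an ABAP routine (type 6) on the selection field CALMONTH;
    " only the body between the frame lines is your own code.
    FORM compute_calmonth
      TABLES   l_t_range STRUCTURE rssdlrange
      CHANGING p_subrc   LIKE sy-subrc.

      DATA ls_range LIKE LINE OF l_t_range.

      READ TABLE l_t_range INTO ls_range
           WITH KEY fieldname = 'CALMONTH'.
      IF sy-subrc = 0.
        ls_range-sign   = 'I'.
        ls_range-option = 'EQ'.
        ls_range-low    = sy-datum(6).   " e.g. restrict the load to the current month
        MODIFY l_t_range FROM ls_range INDEX sy-tabix.
      ENDIF.

      p_subrc = 0.
    ENDFORM.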

  • Data Load to PSA cuts last Symbol

    Hi
    For a 3.x DataSource I have a PSA with a DEC key figure.
    When I try to load the value -118964643632,91 into the PSA, it is stored as -118964643632,9, so I lose the last digit.
    Is it possible to do something about this error?

    Hi there,
    the number of decimal places in your key figure's definition is valid only for display purposes in the Business Explorer environment.
    Check in the Dictionary (SE11) how your InfoObject is defined. Normally it should be defined as data type DEC with length 17 and 3 decimal places if the data type of your InfoObject is "number".
    You can find your InfoObject if you enter the name /BIC/OIAAAAA in the data type field of the Dictionary, where AAAAA is the technical name of your InfoObject.
    Regards,
    Theodoros

  • Data load through ABAP Program

    Hi,
    I have an issue with a DataSource for which I need to load data through an ABAP program.
    Can anyone suggest a step-by-step approach for extracting data through ABAP programming?

    I think my question was not clear.
    I have the table 'DD02T': in ECC it has 70,000 records, while in BI only 20,000 records are fetched.
    Below is the code:
    Can anyone suggest how I can get the total number of records into BI?
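
    Independent of the code mentioned above (which is not quoted here), a generic extractor function module built on the SAP template RSAX_BIW_GET_DATA_SIMPLE typically follows the cursor/package pattern sketched below; one common reason for only part of the source records arriving in BI is deviating from this pattern, though without the actual code that is only a guess. E_T_DATA, S_S_IF and S_COUNTER_DATAPAKID come from the template; the SELECT is illustrative.

    " Extraction part of the function module (called once per data package)
      STATICS s_cursor TYPE cursor.

      IF s_counter_datapakid = 0.
        " Open the cursor only once, on the first data package.
        OPEN CURSOR WITH HOLD s_cursor FOR
          SELECT * FROM dd02t
                 WHERE ddlanguage = 'E'.    " illustrative selection
      ENDIF.

      " Fetch exactly one package per call; the scheduler keeps calling the FM
      " until NO_MORE_DATA is raised, so all records arrive over several packages.
      FETCH NEXT CURSOR s_cursor
        APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
        PACKAGE SIZE s_s_if-maxsize.
      IF sy-subrc <> 0.
        CLOSE CURSOR s_cursor.
        RAISE no_more_data.
      ENDIF.

      s_counter_datapakid = s_counter_datapakid + 1.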

  • Master data loading issue

    Hi gurus,
    Presently I am working on BI 7.0. I have a small issue regarding master data loading.
    I have a generic DataSource for master data and have to fetch this data to the BW side. I always have to do a full data load to the master data object. The first time, I scheduled the InfoPackage and ran the DTP to load the data to the master data object; no issues, the data loaded successfully. Whenever I run the InfoPackage a second time and run the DTP, I get an error saying duplicate records.
    How can I handle this?
    Best Regards
    Prasad

    Hi Prasad,
    The following is happening in your case:
    Loading the 1st time:
    1. Data is loaded to the PSA through the InfoPackage. It is a full load.
    2. Data is loaded to the InfoObject through the DTP.
    Loading the 2nd time:
    1. Data is again loaded to the PSA. It is a full load.
    2. At this point the data in the PSA itself is duplicated. So when you run the DTP, it picks up the data of both requests that were loaded to the PSA, and hence you get the duplicate record error.
    Please clear the PSA after the data has been loaded to the InfoObject.
    Assign points if helpful.
    Regards,
    Tej Trivedi

  • Issue when doing Delta loads from PSA to DSO using DTP - Pls help Urgent

    Hi All,
    I have done 3 data loads into the PSA, and from the PSA I am loading the data into a DSO, splitting that load into 3 using 3 DTPs (split by fiscal period).
    2 of the DTP loads extract the data from all 3 PSA requests.
    But one of the DTP loads filters on ONE PSA request.
    So we are not getting the full data into the DSO.
    Can someone please help me understand why this DTP load is behaving like this?
    Even though I have not given any filter on request ID, for one load it is still picking up the data filtered on one request ID.
    Cheers,
    Nisha

    Hi Jr,
    Sorry for the late reply.
    I think I found the solution: the difference between the DTPs is that I had ticked "Get request one after another".
    I have changed it now and it is working fine.
    Thanks,
    Nisha

  • Master Data load does not extract Hierarchy nodes in BPC Dimension ACCOUNT

    Hi Experts,
    I am performing a master data load through the standard DM package with the following filter selection:
    1. Chart of Accounts
    2. Hierarchy selection with 4 hierarchy names
    3. Import Text Nodes selected
    4. Set Filters by Attribute OR Hierarchies selected
    I ran this DM package for a set of data and selections a week ago and it worked fine.
    However, when I run it now it is giving issues:
    it extracts any new GL account maintained in the BI system, but it does not extract any hierarchy nodes at all! (I have tested this by deleting the hierarchy nodes and re-running the master data load.)
    I am running the DM package in Update mode with the selection set to External.
    Any suggestions for checks? Has anyone encountered this issue before?
    Regards,
    Shweta Salpe

    Hi guys,
    Thanks.
    I found that the issue was with the transformation file where I was maintaining RATETYPE.
    When I removed the mapping of RATETYPE, it works fine (it pulls the hierarchy nodes); however, now I do not have RATETYPE populated in the system.
    My RATETYPE mapping is:
    RATETYPE=*IF(ID(1:1)=*STR(C) then *STR(TOSKIP);ID(1:1)=*STR(H) then *STR(TOSKIP);ID)
    and in the conversion file I have TOSKIP mapped to *skip.
    I have to skip the rate types for the hierarchy nodes, and my hierarchy nodes start with C and H.
    Now that I have removed the mapping for RATETYPE, can anyone suggest a correct way to achieve this? (Note: the mapping formula above was skipping all of the hierarchy nodes starting with C and H.)
    Regards,
    Shweta Salpe
