Process Chain : Read PSA and Update Data Target in BI 7.0

Hello Experts,
In my process chain I have a variant for Execute InfoPackage that loads the data only into the PSA. The next variant is *Read PSA and Update Data Target*. My data target is a master data hierarchy.
In the variant "Read PSA and Update Data Target" the selection parameters were:
PSA Table: 0GLACCEXT_T011_HIER_BA1
Selection Object Type: the same InfoPackage as in the Execute InfoPackage variant
The error message was: "Variant does not contain a valid combination request/data target".
I checked the PSA table using transaction RSA1OLD and found four PSA tables:
0GLACCEXT_T011_HIER_BA1
0GLACCEXT_T011_HIER_BA2
0GLACCEXT_T011_HIER_BA3
0GLACCEXT_T011_HIER_BA4
All the requests are linked to PSA Table *0GLACCEXT_T011_HIER_BA4*.
I can't understand why four PSA tables exist.
I tried to enter the PSA table 0GLACCEXT_T011_HIER_BA4 in the variant *Read PSA and Update Data Target*, but when I try to save it, it raises a short dump.
If I select 0GLACCEXT_T011_HIER_BA1 in the variant I can save it without any problem,
but then the process chain raises the error message "Variant does not contain a valid combination request/data target" with this variant.
How should I handle this issue?
Please clarify these issues for me as soon as possible.
thanks in advance
Regards
sailekha

Hello,
In the variant Read PSA and Update Data Target I have to select the PSA table name from a preselection list, and the system shows me all four PSA tables that I mentioned.
I can select any one of them, so I tried all four PSA tables in the list, but only with the XXXX......BA1 table can I save the variant and include it in the PC. If I select one of the remaining tables (BA2, BA3, BA4), saving the variant raises a short dump.
I checked the PSA using transaction RSA1OLD and under PSA I find the 4 tables (I don't know whether they really are PSA tables, but the variant states that they are), and all the requests are attached to BA4 only.
I observed that only for hierarchies do I find four PSA tables; for the others only one PSA table is available.
I am totally confused about how to handle this situation. Any suggestions? Please help me resolve this problem.
Thanks & Regards

Similar Messages

  • BW 3.5 - Update to PSA only / Read PSA and Update Data Target

    Hi Folks,
    On a BW 3.5 DataSource I am planning to use an InfoPackage that loads to PSA only (as the source system job runs quite long and we would run other preparation loads in parallel before pushing to the final cube). Later in the process chain, the process Read PSA and Update Data Target then loads the data to the final cube.
    As we plan to run the process chain 3 times a day, I was wondering whether Read PSA and Update Data Target always takes only the latest request loaded to the PSA (as it is always the same InfoPackage), or whether we have to delete the PSA request at the end of each process chain run so that the next run does not load the old PSA request again while it is still in the PSA.
    Thanks for all replies in advance,
    Axel

    You are loading the delta to the PSA multiple times a day, and after each update to the PSA you send the data to the cube.
    Once it is in the PSA, you perform further tasks and then update the cube multiple times (once after each delta).
    Well, the 1st load to the cube should be OK.
    But from the next load onwards it would pick up all the requests from the PSA, and you would get wrong values in the cube,
    because, I think, while using 'Read PSA and Update Data Target' you can't set it for a FULL or a DELTA load.
    It will bring in everything that exists in the PSA.
    I would advise you to clean the PSA before starting the next delta.
    But there is a counter argument:
    if the data mart status is set in the PSA for the request that has already been loaded into the cube, then the next time the system will pick up only the requests which do not have a data mart status.
    You can always give this a try in your Dev system.
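    For what it's worth, before the next run you can also count what is still sitting in the PSA table to see what 'Read PSA and Update Data Target' would pick up. The following is only a rough ABAP sketch; /BIC/B0001234000 is a made-up PSA table name, so look up the real technical name of your DataSource's PSA in RSA1 first:

    " Rough sketch: list the requests still sitting in a PSA table and how
    " many records each request holds. /BIC/B0001234000 is a placeholder -
    " use the technical PSA table name of your own DataSource instead.
    REPORT z_check_psa_requests.

    TYPES: BEGIN OF ty_req,
             request(30) TYPE c,
             recs        TYPE i,
           END OF ty_req.

    DATA: lt_req TYPE STANDARD TABLE OF ty_req,
          ls_req TYPE ty_req.

    SELECT request COUNT( * )
      FROM /bic/b0001234000
      INTO TABLE lt_req
      GROUP BY request.

    LOOP AT lt_req INTO ls_req.
      WRITE: / ls_req-request, ls_req-recs.
    ENDLOOP.

    If every request listed there has already gone to the cube, then cleaning the PSA (or relying on the data mart status) before the next delta is what decides whether old data gets read again.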
    Edited by: Vishal Sanghvi on Apr 1, 2011 1:58 PM

  • PSA to Multiple Data Targets in process chain

    Hello All,
    I am trying to create a process chain that loads data from an already loaded PSA to further targets. (Note that the InfoPackage that loads from R/3 to the PSA is in another process chain.) I am using the "Read PSA and Update Data Target" process type, but it allows only one data target. I need to push the data to all the targets of that InfoSource. Do I have to add a process for each target, or is there a workaround? Please help.
    Thank you,
    Rinni.

    Hi Hemant,
    Thank you for your detailed answer. I know that it is mostly not practical to have the InfoPackage and the "Read from PSA" process in different local chains. However, I am facing a situation where I need to create one chain that just extracts all the data from R/3 and loads it as far as the PSA in BW (this part is very time critical, so we do not wish to put any other processing in this chain). In the second chain we then want to put the further processing steps within BW. That is the reason the loading from PSA to ODS needs to be in a separate chain.
    Another thing: the variant does have a field to enter a data target, and it shows only the targets of the InfoPackage, but it allows me to select only one. However, I will go ahead and try with the InfoPackage entry as you suggested.
    But if you could just elaborate on why you said I CANNOT have PSA to targets in a separate chain, I would really appreciate it!
    Thanks again,
    it was a big help!!

  • BI process has gotten stuck and the data from CRM hasn't gotten updated

    Hi friends, I am working on BI in CRM.
    A BI process has gotten stuck and the data from CRM hasn't been updated in the BI cubes.
    Can anyone help me solve this issue?
    What may be the problem, and what would the solution be?
    Do I need to restart the process chains, or something else?

    Is it already scheduled for a daily run, or are you running the process chain manually?
    If you are running it manually, select your process chain, then right-click on the start process --> Maintain Variant --> Scheduling --> in the scheduler screen click on 'Immediate', save the changes and save in the next screen; once you are back in the main screen you have a scheduling icon (clock icon). Click on it. Once done, it will be OK.
    Or, if only one process failed, you can right-click on the failed process and choose the 'Repeat' option to repeat that process.
    Also check whether the source system connection is fine: RSA1 --> Source Systems --> Check.
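    If you would rather (re)trigger the chain from a small program instead of the scheduling dialog, the API function module RSPC_API_CHAIN_START can be used. This is only a rough sketch: Z_MY_CHAIN is a placeholder chain name and the parameter names are written from memory, so check the function module in SE37 first.

    " Rough sketch: start a process chain from a program instead of the
    " RSPC scheduling dialog. Z_MY_CHAIN is a placeholder chain name and
    " the parameter names should be verified in SE37.
    REPORT z_start_chain.

    DATA lv_logid TYPE rspc_logid.

    CALL FUNCTION 'RSPC_API_CHAIN_START'
      EXPORTING
        i_chain = 'Z_MY_CHAIN'
      IMPORTING
        e_logid = lv_logid.

    WRITE: / 'Chain started, log id:', lv_logid.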
    Edited by: Murali M on Sep 2, 2010 6:37 AM

  • Creating an external content type to read and update data from two tables in SQL Server using SharePoint Designer

    Hi
    How can I create an external content type to read and update data from two tables in SQL Server using SharePoint Designer 2010?
    I created a BCS service using the Central Administration site.
    I have two tables in SQL Server:
    1)Employee
    -empno
    -firstname
    -lastname
    2)EmpDepartment
    -empno
    -deptno
    -location
    I just want to create a list that displays employee details from the two tables:
    empid firstname deptno location
    and at the same time updates both tables.
    adil

    When I try to create an external content type based on a view (AdventureWorks2012.vSalesPerson), I can display the data in an external list. When I attempt to edit it, I get an error:
    External List fails when attached to a SQL view        
    Sorry, something went wrong
    Failed to update a list item for this external list based on the Entity (External Content Type) 'SalesForce' in EntityNamespace 'http://xxxxxxxx'. Details: The query against the database caused an error.
    I can edit the view in SQL Manager, so it seems strange that it fails.
    Any advice would be greatly appreciated.
    Thanks,
    Randy

  • Update data targets from ODS object automatically

    Hello experts,
    I have an ODS (with two delta loads every night) with the setting "Update data targets from ODS object automatically". This ODS is the source of only one other ODS.
    But we have a chain with the following steps: delta 1 --> ODS1,
    delta 2 --> ODS1; Activate ODS1; delta --> ODS2; Activate ODS2.
    Could the setting "Update data targets ... automatically" create problems?
    I ask because it has happened that in the delta load of ODS2 some records had only the key filled and all the data fields blank; if I then do a full repair load of these records, all data are filled correctly.
    Could this be the cause?
    Please help me, it is urgent.
    Thanks,
    Vito

    Hi Vito,
    To have better control of the processes when an ODS gets data from multiple sources and supplies it onward to another ODS and a cube, we generally deactivate the flag "Update data targets automatically".
    The process chain then looks like this:
    From sources 1, 2 and 3, update data into ODS1.
    Activate data in ODS1.
    Initiate the delta from ODS1 to ODS2.
    Activate data in ODS2.
    We did face some problems with blank records where only the key was filled; after deactivating the checkbox "Update data into data targets automatically" a lot of those issues were resolved.
    Hope that helps.
    Regards
    Mr Kapadia
    Assigning points is the way to say thanks in SDN.

  • Split "Upload from PSA" to separate Data Targets

    Hi,
    I have the following situation:
    1 ODS as the data source
    2 ODSs and 1 cube as data targets.
    I'm updating my data targets with deltas from one InfoPackage using a process chain.
    As the cube is also a data source for another cube, I want to speed up the upload by separating the loads to the data targets.
    So:
    first step: data from the 1st ODS to CUBE 1, so that I can then process data on to CUBE 2 with a different process chain;
    2nd step would be to upload the data again from the 1st ODS to the two remaining ODSs.
    The point is that I'm using one InfoPackage and cannot split the loads per data target. I tried to create two InfoPackages, one with CUBE 1 as data target and a second with my ODSs as data targets, but the second InfoPackage doesn't return a delta any more.
    Is there any solution for such a case?

    Thanks Arun for your input. The problem is that I don't want to create new transfer rules, update rules or any other objects. The only question is whether data from the PSA can be loaded to a specific target using a process chain. I know I can use a specific request to update the data targets separately, but that involves manual work, and I need to use InfoPackage variants.
    Message was edited by: A. Pazucha

  • How to create a process chain for PSA archiving

    SDN experts,
    I would like to create a process chain for PSA archiving, but I don't see any process type for archiving when I go to create a process chain. Please let me know.
    thanks

    Hi,
    The document below says on page 14 that it is not possible to archive master data and the PSA:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/f47fec9f-0501-0010-ce84-8a7231715ae6
    At least that is my understanding.
    But one thing you can do is use another ODS, load the same data into it, and try to archive that data (only if you have transformations in the current cube/ODS).
    Correct me if I am wrong, and please elaborate on your requirement.
    Regards, Siva

  • Process Chain running even after end date

    Hi,
    We are scheduling a month-end PC to run every 2 hours, and we have given an end date of this evening, but the process chain keeps running every 2 hours even after the end date, so we have to remove it from scheduling manually.
    Is there any way the PC can stop on its own after the end date?
    Any help is highly appreciated.
    Thanks
    Nilesh Pathak

    There might be multiple released jobs for this process chain.
    So check that. Right-click on the process chain --> Display Scheduled Jobs --> here all the job logs of the chain are available, so check them there.
    As you have descheduled it now, you probably won't see it any more: the first released job might have moved to Scheduled status as soon as the end date was reached, and the second one might have gone when you removed the chain from scheduling.
    Anyway, check that, and also check in SM37 whether any other user has triggered it manually today.
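    If you want to double-check for leftover trigger jobs without clicking through SM37, a small report along these lines can help. This is only a sketch: BI_PROCESS_TRIGGER is the standard job name of the chain's start process, and the table fields and status codes ('P' = scheduled, 'S' = released) are written from memory, so verify them against SM37 first.

    " Rough sketch: list scheduled/released trigger jobs of process chains.
    " BI_PROCESS_TRIGGER is the standard job name of the start process;
    " the status codes are assumptions - cross-check with SM37.
    REPORT z_find_chain_trigger_jobs.

    DATA: lt_jobs TYPE STANDARD TABLE OF tbtco,
          ls_job  TYPE tbtco.

    SELECT * FROM tbtco
      INTO TABLE lt_jobs
      WHERE jobname = 'BI_PROCESS_TRIGGER'
        AND status IN ('P', 'S').

    LOOP AT lt_jobs INTO ls_job.
      WRITE: / ls_job-jobname, ls_job-jobcount, ls_job-status,
               ls_job-sdlstrtdt, ls_job-sdlstrttm.
    ENDLOOP.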

  • InfoPackage doesn't update Data Target

    I have a general problem: the InfoPackage doesn't update any data target, especially the InfoCube.
    I am using the 3.x data flow.
    When I use a DTP instead, I successfully update the data target.
    I need to know why the InfoPackage doesn't update the data target.
    Edited by: Mohamed Anas on May 29, 2011 9:36 PM

    hi Mohamed,
    Make sure your InfoPackage's processing option is not 'PSA only'; there are multiple options such as PSA only, PSA and then into data targets (sequentially), PSA and data targets in parallel, or data targets only. If that is not the issue, is your DataSource an emulated DS or a 3.x one? Make sure you use the correct DataSource type; you can change that in transaction RSDS. If that is still not the issue, then tell us whether you are getting any error, and whether it is specific to one DataSource or affects others as well. Also try to run it in the background instead of in dialog. Hope that helps.
    thanks.
    Wond

  • Process Chain Failed at "AND" stage

    Hi Experts,
    Could anyone guide me through the steps to rectify a process chain failure at the AND stage?
    Steps:
    1.Start
    2.Data loads(9)
    3.AND(failed here)
    Error message: This AND process is not waiting for event RSPROCESS, parameter 3VW29DJ1PBX67PFJHD5IAEYI1

    We encountered a similar error. Actually it wasn't an error; the process chain just stopped at the AND process and did not continue. We are on BW 3.5. I believe there are OSS Notes for this; see 1005481 or 1003941.
    If this is not the case, try to reschedule the process chain. To remove it from scheduling:
    1. Select Process Chain --> Remove from Scheduling.
    2. Activate and schedule the process chain.
    We did not install the notes above, and rescheduling did not work for us. We ended up changing and transporting the process chain again, and then we were fine.
    Hope this helps.

  • How to get new and updated data into LO Excel in Xcelsius

    Dear Experts,
    I have created a dashboard on top of a Webi report using a Live Office connection. The latest data of the Webi report is imported into Excel, the data is mapped to components, and the SWF file is generated and exported to the server.
    Today my Webi report has a latest instance with new and updated data, but unless I click "Refresh All Objects" I do not get the updated data into Excel.
    When I open the dashboard in BI Launch Pad/CMC it shows whatever data exists in Excel (i.e. yesterday's data). But we need to show the data of the latest instance of the Webi report (i.e. the new and updated data as of now).
    I have selected the option "Latest instance: From latest instance scheduled by" in the refresh options.
    My questions and doubts:
    1) Is it mandatory to open the dashboard every day and click "Refresh All Objects" to get updated data into Excel and the dashboard?
    2) Is there any option to automate this process?
    Regards,
    PRK.

    Hi,
    Schedule the Webi report to get the latest data from the source. To answer your query: no, it is not necessary to open the dashboard every time to refresh the Excel data and get the latest data.
    Please use "Refresh Before Components Are Loaded": select this option to refresh the data each time the model loads and to use that data as the initial data for the model (with a Reset Button component, it will reset the data to the values from the last time the model was loaded).
    Since you are using Live Office, automatic refresh is not possible without touching the SWF file; you need to use the refresh to get the latest data. If you use QAAWS, Web Services & XML, then automatic refresh is possible.
    For more information please check the document below for an in-depth look at the design pattern.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/b02e31fb-3568-2e10-e78f-92412c3c0a96?overridelayout=t…
    Kindly revert for more clarification!
    --SumanT

  • Connecting to datasource and retrieve, insert and update data in SQL Server

    hi,
    I am trying to retrieve, insert and update data from SQL Server 2000 and display it in a JSPDynPage for a Portal application. I have already created a data source in Visual Composer. Are there any sample codes that I could use as a reference?
    Thanks
    Regards,
    shixuan

    Hi,
    See this link
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/6209b52e-0401-0010-6a9f-d40ec3a09424
    Regards,
    Senthil kumar K.

  • How to know the process chain start time and end time

    Hi Experts,
    How can I find out the start time and end time of a process chain?
    Thanks in advance
    Regards
    Gutti
    Edited by: guttireddy on Feb 23, 2012 11:30 PM

    Hi Reddy,
    You can find the run-time of a PC using the steps below.
    1. Call SE38 > /SSA/BWT > Execute > enter your PC, choose the date and time > Execute. The run-time of the PC is displayed here. (or)
    2. Call RSPC1 > enter your PC > Execute > go to the log view > right-click on the start variant > Displaying Messages > note down the start time on the Chain tab. Then right-click on the last process type of the PC > Displaying Messages > note down the end time on the Chain tab. The difference between the start time and end time gives the run-time of your PC.
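    If you need the run-time programmatically (for example for a regular report), you could also read it from the chain log tables. This is only a sketch: the table and field names (RSPCLOGCHAIN, RSPCPROCESSLOG, STARTTIMESTAMP/ENDTIMESTAMP) are written from memory, so please verify them in SE11 before relying on it.

    " Rough sketch: run-time of the latest run of a process chain, read
    " from the chain log tables. Table/field names are assumptions -
    " check them in SE11 first.
    REPORT z_pc_runtime.

    PARAMETERS p_chain TYPE rspc_chain.

    DATA: BEGIN OF ls_log,
            log_id TYPE rspc_logid,
            datum  TYPE d,
            zeit   TYPE t,
          END OF ls_log.

    DATA: lv_start TYPE timestamp,
          lv_end   TYPE timestamp,
          lv_secs  TYPE i.

    " Latest log id of the chain
    SELECT log_id datum zeit FROM rspclogchain UP TO 1 ROWS
      INTO ls_log
      WHERE chain_id = p_chain
      ORDER BY datum DESCENDING zeit DESCENDING.
    ENDSELECT.

    " Earliest start / latest end over all processes of that run
    SELECT MIN( starttimestamp ) MAX( endtimestamp )
      FROM rspcprocesslog
      INTO (lv_start, lv_end)
      WHERE log_id = ls_log-log_id.

    lv_secs = cl_abap_tstmp=>subtract( tstmp1 = lv_end
                                       tstmp2 = lv_start ).
    WRITE: / 'Run-time in seconds:', lv_secs.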
    Hope this helps.
    Regards
    Sai

  • What is the best software program that I can use to read, write and modify data/files on an external HD (NTFS format, i.e. Windows)?

    Hi guys,
    I'm new to the Mac and have a MacBook Pro running Lion (10.6.8 I think!) with Parallels 7 (Windows 7) installed. Can someone please tell me what the best software program is to read, write and modify data/files on an external HD (NTFS format) from the Mac OS? I have heard of Paragon and Tuxera NTFS. Are they free? Are they good? Are there any other programs out there? I heard that some people have issues with Paragon.
    Thanks.

    Your best bet would be to take the drive to the oldest Windows PC that is compatible with it, copy the files off, right-click and format the drive as exFAT (XP users can download exFAT support from Microsoft), and then put the files back on.
    Macs can read and write all Windows file formats except writing to NTFS (and in some cases not even reading it), so if you can change the format of the drive to exFAT (all data has to be removed first), you will have a drive that doesn't require paid third-party NTFS software (a license fee goes to Microsoft) for updates.
    It's also one less hassle to deal with.
    Drives, partitions, formatting w/Mac's + PC's
