ABAP routine in InfoPackage

Hi
How do I write my ABAP code in the InfoPackage in this scenario?
1. I have a custom table in BW with some average manufacturing order numbers for each period.
2. When loading from the standard InfoSource up to the COPC InfoProvider, I want to load only the manufacturing orders found in that table.
3. I can see that it is possible to write ABAP code for the manufacturing order selection field in the InfoPackage.
I just want to select, via ABAP, the actual orders from my table into the selection for manufacturing order.
Can someone help me with this code?
// S

Hi Manfred
Yes, I can do that. But I am going to load only about 10 % of all the data, so instead of reading everything and deleting 90 % of the records, I think it's better to directly extract the 10 % of data I need.
Do you think it is possible to do it my way?
// S
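A data-selection routine along these lines might do it. This is only a minimal sketch: the custom table name ZCOPC_ORDERS and its fields ORDER_NO and PERIOD are placeholders for your own table, and the routine frame (the FORM header with the L_T_RANGE table) is whatever BW generates for you on the Data Selection tab (routine type 6); keep the generated frame and only replace the body.

```abap
* Data-selection routine for the manufacturing order field in the
* InfoPackage (Data Selection tab, routine type 6).
* ZCOPC_ORDERS, ORDER_NO and PERIOD are placeholder names for the
* custom table and its fields -- adapt them to your system.
data: lt_orders type standard table of aufnr,
      lv_order  type aufnr.

* Read the pre-selected manufacturing orders for the current period
select order_no from zcopc_orders into table lt_orders
       where period = sy-datum(6).

* Build one single-value selection line per order
loop at lt_orders into lv_order.
  clear l_t_range.
  l_t_range-sign   = 'I'.
  l_t_range-option = 'EQ'.
  l_t_range-low    = lv_order.
  append l_t_range.
endloop.

p_subrc = 0.
```

With this, the extractor only requests the orders found in your table, so you avoid loading the other 90 % and deleting them afterwards.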

Similar Messages

  • Loading Infopackages one at a time (i.e. one by one)

    Dear All BI Experts,
    We are implementing SCM5.1 with BI7.0
    We load a cube in BI with daily forecast data from APO, but unfortunately we have spotted one of the dates is wrong, so the data in the cube has been summing up incorrectly.
    That has now been fixed via the transformation, so what I want to do is reload the cube but with one Infopackage at a time (not all eight as are queued up now).
    Does anyone know how to load packages one by one?
Any help will be gratefully received.
    Nigel Blanchard

    Hi Nigel Blanchard
There is an old-fashioned solution; I am sorry if my mind works like an old-fashioned guy's. But let me share my thought here.
As I suggested, just run the DTP between the DataSource and the cube with the "Get delta request by request" option, using the process below.
Suppose you have 18 requests in the PSA.
When you run the DTP the first time, it tries to fetch the requests one by one up to the 18th. So you can manually stop the process, delete the unwanted requests, change the date-related info in the transformation, and then run the DTP again. This time it will try to fetch requests 2-18; after the second request has loaded, break the load by changing the QM status of the DTP. If that is not possible, go to the target cube's Manage screen, stop the data load there, and remove all requests other than 2 (i.e. remove 3-18).
Then repeat the process:
3-18: break the load, delete requests 4-18
4-18: break the load, delete requests 5-18
Repeat the process until all the requests are finished, and finally cross-verify whether all the requests were loaded or not.
Hope it's a little clearer!
    Thanks
    K M R
    ***Even if you have nothing, you can get anything.
    But your attitude & approach should be positive..!****
    >
    Nigel Blanchard wrote:
> Dear Respondents so far,
    >
    > At least KMR is the closest!
    >
    > Geo - Your comment doesn't really help me.
    >
    > Jagadish - Yes you can move data from a data source or PSA straight into a Cube. You do not need a DSO. So I do mean infopackages.
    >
    > KMR - You seem to have got the idea of what I am doing.
    > Forecast data is created in APO and saved into a DSO.
    > We have built an extract datasource in BI that fetches this data daily to a datasource on BI.
    > We then load this data directly into a cube. We have no need for a DSO as no transformation takes place (the DSO in APO essentially becomes our corporate memory anyway).
    >
    > So the problem is this;
    >
    > Initially, moving the data from the datasource to the cube has gone wrong due to the date problem. We have now fixed this in the transformation between the cube and the datasource.
    > To reload the data back into the cube, I need to load one package at a time. After each package I need to change the transformation to reset this date. So, that is what I mean when I say I need to load one package at a time.
    > The process will look something like; Set date on transformation->Load First package->reset date on transformation->Load Second package->reset date on transformation->Load Third package and so on until I catch up with today's load (about 8 days worth). After that I can just use the 'Current Date' function to automatically fill in the correct date. So the sooner I do it the less manual intervention required!
    >
> You are right about the DTP to delta the data into the cube request by request, but this does not provide me with the break between requests that I need to manually reset the date in the transformation. This option will run all of them one after the other without stopping. Is there no way I can just specify the request I want to load by request number, or an option to simply load one request, so I get the break between loads that I need?
    >
    > Many thanks for your time and help so far.
    >
    > Nigel.
    Edited by: K M R on Feb 16, 2009 4:30 PM

  • Error while Scheduling InfoPackage

When I schedule a full load in the InfoPackage to load data to the PSA, I get a DBIF_RSQL_INVALID_REQUEST error. The description is "An invalid request was made to the SAP database interface in a statement in which the table "EDID4" was accessed".
    Can anyone help me overcome this and resolve the issue, please.
    Thanks
    Neha

Check OSS Note 89384 to see if it is relevant to your issue.

  • DataSource 0CRM_SRV_PROCESS_H is not getting the data in infopackage

    Hi Masters,
I have to upload the data from the DataSources 0CRM_SRV_PROCESS_H and 0CRM_SRV_PROCESS_I via InfoPackages, but when I execute the InfoPackage it shows:
    No data available
    Diagnosis
    The data request was a full update.
    In this case, the corresponding table in the source system does not
    contain any data.
    System Response
    Info IDoc received with status 8.
    Procedure
    Check the data basis in the source system.
In the CRM system, when I check in RSA3, it shows the data.
I checked the older posts but I haven't found any answer. I also checked SAP Note 692195, but it is applicable to 4.0, and I am working on 7.0.
Please suggest something so that I can get the data into the InfoPackage.
    Thanks and Regards,
    Vicky.

    Hi Lokesh,
    Thanks for your reply.
Actually, I am working on a project where they don't give SAP_ALL authorization to any user. If we need any authorization, we have to show them the relevant SAP Note for the same version; in my case it is CRM 7.0. If you know any note on CRM 7.0 which lists all the authorization objects required, or which says we need SAP_ALL authorization, please tell me.
It would be very helpful for me in getting the authorization.
    Regards,
    Vicky.

  • BI 7.0 data load issue: InfoPackage can only load data to PSA?

    BI 7.0 backend extraction gurus,
We created a generic DataSource on R3 and replicated it to our BI system; then we created an InfoSource, the transformation from the DataSource to the InfoSource, an ODS, and the transformation from the InfoSource to the ODS.
After the transformation between the InfoSource and the ODS is created on this BI system, a new folder called "Data Transfer Process" also appears under this ODS in the InfoProvider view. In the Data Transfer Process, on the Extraction tab, we pick 'Full' in the Extraction Mode field; on the Execute tab there is an 'Execute' button. Clicking this button (note: so far we have not created an InfoPackage yet) appears to run the data load, but we find there is no data available even though all the statuses show green (we do have a couple of records in the R3 table).
Then we tried to create an InfoPackage. On the Processing tab, we find the 'Only PSA' radio button is checked and all the others, like 'PSA and then into Data Targets (Package by Package)', are dimmed! On the Data Target tab, the ODS cannot be selected as a target! There are also some new columns in this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under another column, 'DTP(s) are active and load to this target', there is an inactive picture icon, which is weird since we have already activated the Data Transfer Process! Anyway, we started the data load in the InfoPackage, and the monitor shows the records are brought in, but since 'Only PSA' is checked on the Processing tab with all other options dimmed, no data reaches this ODS! Why in BI 7.0 can the 'Only PSA' radio button be checked with all the others dimmed?
There are many new features in BI 7.0! Anyone's idea/experience on how to load data in BI 7.0 is greatly appreciated!

You don't have to select anything.
Once the data is loaded to the PSA, in the DTP you have the option of FULL or DELTA: full loads all the data from the PSA, and delta loads only the last load in the PSA.
Go through the links for lucid explanations:
    Infopackage -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
    DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Creating DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
    <b>Pre-requisite-</b>
    You have used transformations to define the data flow between the source and target object.
    Creating transformations-
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
    Hope it Helps
    Chetan
    @CP..

  • BI 7.0 Urgent:  unable to locate an InfoPackage in any view

In BW 3.x, we can locate an InfoPackage which loads data from one ODS to another ODS in the DM (Data Mart) view of the InfoSource view. But in BI 7.0, we cannot locate this InfoPackage in the DM (Data Mart) view; we can only locate it in the process chain. Could someone let us know the trick for locating such an InfoPackage in a view in BI 7.0?
    Thanks

    Hi Kevin,
I don't exactly understand what you mean by "view". But what I understand from your question is that you want to know whether there is any place where we can find the InfoPackages without using the technical name?
If that is what you are looking for, you can drill down the related InfoProvider data flow (a nice feature introduced by SAP in BI) and look for your InfoPackages. Just keep in mind that sometimes you won't find the InfoPackages even when you drill down through the InfoProvider, because you need to check which source system you are extracting data from. In the InfoProvider, change the source system, then drill down and search by description.
    Hope this helps,
    Bye...

  • Infopackage Idocs in status 2 - could not find code page for receiver system

    Hi,
    We just migrated our production system from BW 7.01 non unicode to BW 7.4 on HANA.
We now encounter issues with IDocs while loading data into BW from our ECC5 source. When we analyze the IDocs in the source system, they appear with the message "could not find code page for receiver system".
One weird thing is that the IDoc seems to have been created before we started the InfoPackage in BW. We checked the system time and the application server time, and everything seems OK.
We did not encounter this issue in our previous migration test runs.
    Hope someone can help
    Christophe

    Hi,
Thanks for responding. We finally found out what the problem was.
We have two application servers on our ECC system with two different operating systems. One of them could not reach the new BW HANA server.
    Regards
    Christophe

  • Update data from ODS to ODS with infopackage selection

    Hi,
I am trying to update data from one ODS to another ODS with selection criteria in a manually created InfoPackage. For a full load I can give selection criteria in the InfoPackage, but when I initialize the delta, Data Selection is greyed out even though selections exist for the full load. Please advise me how to give selections for delta loads from ODS to ODS.
    Thanks in advance.
    Ram

Once you have started loading an ODS as a destination in FULL mode from a DataSource, you cannot go back.
So if you want to update from ODS to ODS using the change log, but considering only some data records, you could create update rules with a start routine that deletes the undesired records (e.g. DELETE DATA_PACKAGE WHERE ...) and then start an init/delta load.
    Hope it helps
    GFV
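GFV's suggestion could be sketched like this in a 3.x update-rule start routine. This is only a sketch under assumptions: the selection table ZCOPC_ORDERS, its field ORDER_NO, and the DATA_PACKAGE field AUFNR are placeholder names to be replaced with your own objects.

```abap
* Start routine: drop all records whose order is not in the
* custom selection table (ZCOPC_ORDERS / ORDER_NO / AUFNR are
* placeholder names -- adapt them to your data model).
data: lt_orders type sorted table of aufnr with unique key table_line.

* Fetch the allowed orders once, before scanning the package
select distinct order_no from zcopc_orders into table lt_orders.

loop at data_package.
  read table lt_orders
       with table key table_line = data_package-aufnr
       transporting no fields.
  if sy-subrc <> 0.
    delete data_package.          " record not in the selection table
  endif.
endloop.
```

Note that this filters after extraction, so the full data set still travels from the source; a selection routine in the InfoPackage (where the DataSource supports selection on the order field) avoids extracting the unwanted records in the first place.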

  • Scheduling InfoPackage fails to start extract in R/3

    Hi,
    We are extracting data from CO-PA using the delta infosource.
    We initialised the Delta successfully (without data transfer) and have now scheduled our infopackage to extract data from R/3.
    When we go to monitor, it tells us that the "Request is still running". Within Details:
    Requests: Everything OK
    Extraction: Missing messages
    Transfer: Missing messages or data
    Processing: no data
When I go to R/3, SM37 shows that in fact no extraction job has been scheduled. Consequently, the InfoPackage sits in a "Still running" status until it times out.
If I go to the IDoc list in R/3 (transaction WE05), I see in the inbox an IDoc from my BW system of type "RSRQST" (BIW: Data Request message to OLTP) with a status of 64 ("IDoc ready to be transferred to application").
It seems that something is preventing this IDoc from initiating the extract.
    Can anyone suggest a cause of this problem?
    Thanks
    Leanne

Hi,
How about the other InfoPackage extractions?
Try to check the following from the monitor of the request, menu Environment:
- Transaction RFC >> in the data warehouse
- Transaction RFC >> in the source system
- ALE management >> in the data warehouse
- ALE management >> in the source system
- Check connection
- RFC logon

  • Multiple InfoPackages loads to a cube

Can I load data from a number of InfoPackages into an InfoCube in parallel, or is it best not to? Thanks

    Hi Niten,
in addition to all the hints provided already, I'd like to mention:
depending on your system settings (number of parallel processes, ...), it is critical to check whether, and how many, data loading processes are scheduled on the same box in the same time window. If you schedule too many InfoPackages for your available processes, the system (dialog and batch processes) could be blocked and no other jobs could be started.
hope it helps.
    Lilly

  • Infopackage-Load Many Files from Application Server and later Archive/Move

Hi All,
I have a doubt. I have a requirement to take many files and load them into BI 7.0. I have used the InfoPackage before with the option:
Load Binary File From Application Server
I loaded the information successfully, but only with one file. I need to load many files (with different names), like the list below, and I think it's not a good idea to modify the file name (path) in the InfoPackage each time:
All of these files will be on one server that is mapped in AL11, like:
Infopfw
BW_LOAD_20090120.txt
BW_LOAD_20090125.txt
BW_LOAD_OTHER_1.txt
...
Etc.
This directory is not on the BW server; it's on another server, but I can load from this location (one file at a time).
Could you help me with these questions:
- How can I use an InfoPackage with a routine that takes all the files, one by one, in order of creation date, and loads them into the target? Is it possible? I have some knowledge of ABAP, but I don't know exactly how to express this logic to the system.
- In addition, is it possible to move these files to another location, like an archive folder, just to have a history of the loaded files?
I saw that in the InfoPackage you have an option to create a routine in ABAP code. I'm a little bit confused because I don't know how I can specify the whole path. I tried with:
Infopfw
InfopfwFile.csv
Infopfw
This is the ABAP code skeleton that you see and need to modify:
* Create a routine for the file name
* This routine will be called by the adapter
* when the InfoPackage is executed.
          p_filename =
          p_subrc = 0.
Thank you for your ideas or recommendations.
Al

Hi Reddy, thank you for your answer.
I have some doubts about the first option you explained:
"All of the above files have dates appended at the end of the file name. You can load the files through the InfoPackage by using routines and pick the files based on the date at the end of the file name."
I need to ask: if you know the date of the file and the InfoPackage picks each file, can this work for many files? How is it possible to control this process?
About this option: when you mention Unix code, where is this code programmed? In the routine of the BW InfoPackage?
"Or:
Create two folders at application server level in your BW, in AL11 (ask the Basis team). I'll call them F1 and F2.
First dump the files into F1. Assume the file name in F1 is "BW_LOAD_20090120.txt"; using a Unix script you rename the file, and then keep it in the same folder F1 or move it to F2.
Then create the InfoPackage and fix the file name (i.e. the renamed one), so you don't need to change your file name at InfoPackage level every day, because in AL11 the files are overwritten every day.
So I get BW_LOAD_20090120.txt in F1, then rename it to BW_LOAD.txt and load it into BW; tomorrow I get BW_LOAD_20090125.txt in F1, then rename it to BW_LOAD.txt, and so on.
This way it will work. You need to schedule the Unix script in AL11. This is the way to handle the application server; I'm using the same logic."
Thank you so much.
    Al
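For the routine-based option, the file-name routine in the InfoPackage might be filled in along these lines. This is only a sketch under assumptions: the directory /Infopfw and the BW_LOAD* mask are placeholders, and it relies on the standard function module EPS_GET_DIRECTORY_LISTING (check its availability and the required authorizations on your release). Since the dates are appended to the file names, sorting the names descending picks the newest file.

```abap
* File-name routine: pick the newest BW_LOAD* file from the
* application server directory. The names carry the date, so the
* highest name is the newest file. Directory and mask are
* placeholders -- adapt them to your landscape.
data: lt_files type standard table of epsfili,
      ls_file  type epsfili.

call function 'EPS_GET_DIRECTORY_LISTING'
  exporting
    dir_name  = '/Infopfw'        " placeholder directory
    file_mask = 'BW_LOAD*'
  tables
    dir_list  = lt_files
  exceptions
    others    = 1.

if sy-subrc <> 0 or lt_files is initial.
  p_subrc = 4.                    " no file found -> fail the request
else.
  sort lt_files by name descending.
  read table lt_files into ls_file index 1.
  concatenate '/Infopfw/' ls_file-name into p_filename.
  p_subrc = 0.
endif.
```

The routine only supplies one file per InfoPackage run, so to process several files you would still schedule the InfoPackage repeatedly (e.g. in a process chain) and have an OS-level script archive each file after its load, as described above.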

• BW report to show all error messages from an InfoPackage

Dear Experts,
I am looking for a way to analyze within a BW report (a specific one, or one already existing in BI Content) all the error messages generated by an InfoPackage and stored in the status field of a PSA. Does someone have any feedback or ideas on that?
    Thanks in advance,
    LL

Thanks Sam,
And how do I retrieve the detailed message for each wrong status?
    LL

  • Process chain and Infopackage?

    Hi,
I am creating a process chain in which I completely delete the data contents from the cube and then load data (this is our requirement). There is an error:
A type "Complete Deletion of Data Target Contents" process cannot follow process "Execute InfoPackage" var. ZPAK_11VS7H4SRVUOPFQDDJO9V7SSG in the chain
Please help me to resolve this.
    <removed by moderator>
    Thanks,
    Vasu.
    Edited by: Siegfried Szameitat on Jan 26, 2009 1:23 PM

    Hi Vasu
It seems that BW doesn't want an InfoPackage followed directly by a deletion; it may consider that you can't do this even if the load is not the same one.
You should add another step in between, like an index step for your InfoCube ...
    Kind Regards
    Mickael

  • Infopackage and process chain status

    Hello expert,
I am new to BW technology and I need some help...
Well, I have created a process chain in order to load data into an InfoProvider. This process chain runs every day in the production system (it's a typical PC: delete PSA request, run InfoPackage, run DTP, ...).
I checked the last status of the process chain and it has been yellow for two days. It was due to a communication error between the ERP and BW systems; therefore (for some reason) the InfoPackages started but never ended (always yellow status).
So, my questions are:
1. Is there any configuration for the InfoPackage to prevent a yellow status (for more than X time)?
2. Could we control it in a process chain?
    Thank you and best regards

    Hi,
We can configure the InfoPackage for the case where no records are found, and you can also set up a time limit.
You can set the status the request is to receive per InfoPackage in the Scheduler, via Scheduler > Warning Handling. This setting overrides the system-wide setting.
    Check the below link for more info :
    http://help.sap.com/saphelp_nw04/helpdata/en/80/1a65cfe07211d2acb80000e829fbfe/frameset.htm
    Regards,
    Satya

  • Loading in infopackage takes lot of time

    Hi Friends,
When I schedule and activate the process chain every day, it takes a lot of time, around 5 to 6 hours, for the InfoPackages to load.
In ST13, when I click on the log ID name to see the process chain, the LOAD DATA InfoPackage shows green. After double-clicking on it, in the process monitor I see the request still running and yellow, but with 0 from 0 records, and it stays like this for hours.
It's very slow.
I need your guidance on this.
    Thanks for your help.

    Hi,
I suggest you check a few places where you can see the status:
1) SM37 job log (in the source system if the load is from R/3, or in BW if it's a datamart load): give the request name and it should give you the details about the request. If it's active, make sure the job log is being updated at frequent intervals.
Also see if there is any 'sysfail' for any data packet in SM37.
2) SM66: get the job details (server name, PID etc. from SM37) and see in SM66 whether the job is running or not (in the source system if the load is from R/3, or in BW if it's a datamart load). See if it's accessing/updating some tables or not doing anything at all.
3) RSMO: see what is available in the Details tab. It may be stuck in the update rules.
4) ST22: check if any short dump has occurred (in the source system if the load is from R/3, or in BW if it's a datamart load).
5) Check SM58 and BD87 for pending tRFCs and IDocs.
Once you identify the problem, you can rectify the error.
If all the records are in the PSA, you can pull them from the PSA to the target. Otherwise you may have to pull them again from the source InfoProvider.
If it's running and you can see it active in SM66, you can wait for some time to let it finish. You can also try SM50 / SM51 to see what is happening at system level, like reading from/inserting into tables.
If you feel it's active and running, you can verify this by checking whether the number of records in the data tables has increased.
    SM21 - System log can also be helpful.
    Regards,
    Habeeb
