How to run the EBS adapter data load for 1 year using DAC

Hi,
I am trying to run the EBS adapter data load for 1 year for Procurement and Spend. Please let me know the parameter to set in DAC.
The last extract date is set as Custom Format (@DAC_SOURCE_PRUNED_REFRESH_TIMESTAMP).
Thanks

You need to set $$INITIAL_EXTRACT_DATE to a date one year ago. LAST_EXTRACT_DATE is used for incremental loads, and you do not set it manually.
If this helps, mark the reply as correct or helpful.
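
For reference, a minimal sketch of working out that value; the MM/DD/YYYY output format is an assumption, so match whatever date format your DAC installation expects for source system parameters:

    # Compute the date one year back from today for $$INITIAL_EXTRACT_DATE.
    # The MM/DD/YYYY format below is an assumption -- adjust to your DAC setup.
    from datetime import date, timedelta

    initial_extract_date = date.today() - timedelta(days=365)
    print(initial_extract_date.strftime("%m/%d/%Y"))  # e.g. 06/18/2014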

Similar Messages

  • How to create a filter-by-date query for Access using the Database Connectivity Toolkit

    Hello,
    I started my project by reading Access data using a UDL connection.
    Now I want to create an Access filter using a query that lets me select two dates and retrieve all the corresponding data. How do I do that with the Parameterized SQL Query?
    Best regards,
    Remark: I don't know the SQL language.

    Hi Mike,
    I want to thank you for your help, I really appreciate it.
    I tried to build the code as in your picture and had no problem executing it, but I get no data in the output.
    I changed the date format, because I suppose it is a short date format in the database.
    Can you take a look at it and tell me if you can find the problem? (A parameterized-query sketch also follows after the attachments below.)
    Thank you again for your help.
    Best regards,
    Attachments:
    db.JPG (126 KB)
    lav.JPG (82 KB)
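
    Outside LabVIEW, the same two-date filter can be written as a parameterized query. Here is a minimal Python/pyodbc sketch, where the .mdb path, table name (Measurements) and column names (MeasDate, Reading) are assumptions to be replaced with your own:

     # Filter an Access table between two dates with a parameterized query.
     # Path, table and column names are placeholders -- substitute your own.
     import datetime
     import pyodbc

     conn = pyodbc.connect(
         r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:\data\mydata.mdb"
     )
     cursor = conn.cursor()

     start = datetime.datetime(2011, 1, 1)
     end = datetime.datetime(2011, 12, 31)

     # The two ? placeholders are bound by the driver, so no manual
     # date-to-string formatting is needed in the SQL text.
     cursor.execute(
         "SELECT MeasDate, Reading FROM Measurements WHERE MeasDate BETWEEN ? AND ?",
         start, end,
     )
     for row in cursor.fetchall():
         print(row.MeasDate, row.Reading)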

  • Need to skip the data load for one of the sub-chains

    Hi All,
    In our project we have one meta chain with many sub-chains inside it. Today we do not want to run one of the sub-chains. As we are on BW 7.0, the skip options are not available. Can you please help with how to stop the data load for that particular sub-chain? Please suggest.
    Thanks.

    Hi Jalina,
    If this is a frequent request, you can create a custom ABAP program and use the simple logic below to skip/run the meta chain. You will also have to change the link between the sub-chains to "Always" instead of "Successful", so that the next chain proceeds whether or not the dependent chain is successful. If you are comfortable with events, you can also use an event to achieve this.
    Program logic: create a new table in which you maintain meaningful values (indicator/description); the program reads the data from that table and, based on the values, finishes successfully or fails.
    Thanks
    Abhishek Shanbhogue
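
    As a language-neutral illustration of that gate logic (Python here purely for sketching; the real solution would be an ABAP program reading a custom table), with the table name and chain id as assumptions:

     # Sketch of the gate program: read an indicator from a control table and
     # finish successfully or fail, so the process-chain links decide whether
     # the dependent sub-chain runs. Names below are placeholders.
     import sqlite3
     import sys

     # Stand-in control table (in the real solution this is a custom table
     # maintained in the SAP system).
     conn = sqlite3.connect(":memory:")
     conn.execute("CREATE TABLE chain_control (chain_id TEXT PRIMARY KEY, run_flag TEXT)")
     conn.execute("INSERT INTO chain_control VALUES ('SUBCHAIN_SALES', '')")  # '' = skip today

     def should_run(chain_id):
         row = conn.execute(
             "SELECT run_flag FROM chain_control WHERE chain_id = ?", (chain_id,)
         ).fetchone()
         return bool(row and row[0] == "X")

     # Exit 0 (success) only when the flag says run; the sub-chain hangs off
     # this step via a "Successful" link, while the rest of the meta chain
     # uses an "Always" link so it proceeds either way.
     sys.exit(0 if should_run("SUBCHAIN_SALES") else 1)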

  • How can we automate data loading from BI to BPC

    Dear Gurus,
    Thanks for watching this thread. My question is:
    How can we load data from BI 7.0 to BPC? My environment is SAP BI 7.0 and BPC 7.5 MS version on SQL Server 2008.
    How can we automate data loading from BI to BPC in the MS version? Is a manual flat-file load mandatory in the MS version?
    Thanks in advance,
    Srinivasan.

    Here are some options:
    1) Use standard packages and schedule them:
        a) Open Hub the master data into a flat file / the BPC application server, then schedule the package "Import Master Data from a Data File" and the other relevant packages.
    2) Use custom tasks in custom packages (SSIS).
    Procedure:
    From Microsoft SQL Server Business Intelligence Development Studio, open the Microsoft SSIS folder.
    Create a new package, or select an existing package to modify.
    Choose Task > Register Custom Task.
    In the Task Location field, browse for the target .dll file.
    Note: by default, the .dll files are stored in BPC/Websrvr/bin.
    Enter a task description, select an appropriate icon, then click OK.
    Drag the icon to the designer window. Enter data as required.

  • Master Data Loading for Prices and Conditions in CRM - "/SAPCND/GCM"

    Hi,
    Could anyone give me some input on master data loading for prices and conditions in CRM?
    The transaction code is /SAPCND/GCM.
    I need to load data from a file (extracted from 4.6) for service contracts.
    I tried LSMW: recording does not work for this transaction.
    I am also trying to load through IDocs (LSMW), but that is not really working either.
    Do we require some custom development for this, or is some SAP standard functionality available?
    Can anyone provide some valuable input on this?
    I would appreciate your responses.

    Hi Tiest,
    Thanks for responding.
    You are right, our client is upgrading from 4.6 to ECC.
    So, as per the client's requirements, we are maintaining all the configuration for Services in CRM.
    The Services data that was in 4.6 is being pulled out into flat files which need to be loaded into CRM, so middleware would not be able to do this.
    What I am looking for is some standard upload program.
    LSMW recording does not work.
    With the IDoc "CRMXIF_COND_REC_SLIM_SAVE_M" I am able to load a single record, but I cannot find out how to make it work for multiple entries.
    Normally, when loading master data through IDocs, we map the values to the standard fields available in that IDoc.
    But in this particular IDoc there is a common field for which I need to define a field name and a field value.
    So far I am only able to define just one field name and one field value.
    I want this to work for multiple entries.
    Hope you get my point.
    Thanks

  • How to add a new data element for an existing table field (primary key field)

    Hi Experts,
    How do I add a new data element for an existing table field (a primary key field)?
    For this field there are no foreign key relationships and no check table.
    While activating the table, it gives the messages below.
    Can anyone help me solve this and list the steps to add a new data element for an existing primary key field of a table?
    Check table (NAMING SPACE/TABLE NAME (e.g. /TC/VENDOR)) (username/19.02.10/03:29)
    Primary key change not permitted for value table /TC/VENDOR
    Check on table /TC/VENDOR resulted in errors
    Thanks
    Ravi

    Hi,
    The easiest way is to download the table, e.g. into an Excel sheet (if possible) or a text file. Drop the table from the database. Build your table with the new key field. Build the database table again and fill it.
    You can also do it on the database side into a new table: drop the old one, build the enhanced one and fill it, then drop your (temporary) table.
    Maybe there are other ways, but this works.
    Success,
    Rob

  • How do I get a data plan for iPad

    How do I get a data plan for my iPad?

    Check with your wireless carrier to see if they offer a MiFi device.
    http://en.wikipedia.org/wiki/MiFi

  • Master data loads for Attributes and texts failing

    Hello
    Master data loads for attributes and texts are failing. The errors are:
    1. Lock not set for loading master data attributes
    2. Table /BI0/YAccount does not exist (this error is for the 0Account master data attribute load)
    3. Error 1 in the update
    We faced this error a few days ago and rebooting the server resolved it, but it has recurred after 4 days.
    RS12 and SM12 do not show any locks. Activating the InfoObject has also not resolved the error.
    Any insight is appreciated.
    Thanks
    Inder


  • How to run transaction code BIC for bank key upload

    Hi
    How do I run the transaction code BIC for uploading bank keys?
    Please give me a suggestion.
    Thanks
    Indu

    Go to the BIC transaction screen and fill in the selection screen parameters as below:
    1. Update Run (test run: check off; real run: check on)
    2. Set Deletion Flag (check on)
    3. Maximum no. of records: 999999
    4. Detail List (check on)
    5. Display variant: 1SAP
    6. Presentation server (select radio button)
    7. Application server (deselect radio button)
    8. File name and path: select the relevant file to be uploaded
    9. Bank country: if needed (give the respective country name)
    10. Select the Execute button or press F8
    NOTE: Before the real run, execute a TEST RUN by deselecting the Update Run checkbox.
    Hope this meets your requirement.
    Thanks.

  • Master Data Load for New Attribute

    Hi Users,
    We had to implement a separate load flow for a new field coming from R/3. This field was to be added to an existing master data object.
    I added a new display attribute to the existing 0GL_ACCOUNT master data object.
    This new attribute, along with some other existing fields, gets its data from another master data object, with an InfoSource in between, because two transformations cannot be created for the same source and target.
    When I load the data I don't see data being populated for this new field. I did the attribute change run, checked the keys, etc.
    The source object has data, but after executing the DTP no data comes into this attribute. There are no routines or anything.
    Please suggest.
    Regards
    Zabi

    Hi,
    The situation is:
    Field x from the source maps to
    1. field y (the existing attribute) and also to
    2. field z, which is the new attribute.
    Field y has to get updated for company code 10, for example, and field z for company code 30.
    Now, if I use the same flow and map field x to both y and z, overwriting happens: if code 10 has no value for x but code 30 does, that is not good.
    So if I use a separate flow with an InfoSource, I map only x to z; then, if no value went to y for code 10 in the first DTP and code 30 has a value for x, only z is updated and y remains empty. (A small sketch of this routing follows below.)
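
    To make that routing concrete, here is a minimal sketch of the rule described above; the field names, attribute names and company codes are purely illustrative:

     # Route the same source field x to different target attributes depending
     # on company code, so the two loads never overwrite each other's value.
     def route_attribute(record):
         update = {}
         if record["comp_code"] == "10":
             update["y"] = record["x"]   # first flow: existing attribute y
         elif record["comp_code"] == "30":
             update["z"] = record["x"]   # second flow: new attribute z
         return update

     print(route_attribute({"comp_code": "10", "x": "A"}))  # {'y': 'A'}
     print(route_attribute({"comp_code": "30", "x": "B"}))  # {'z': 'B'}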

  • Master data load for substance creation: using a BAPI in LSMW?

    Hi,
    Has anyone done a master data load for creating a Substance in SAP RM? If so, please let me know the best possible option. I have been trying to figure out an option to load substances and their IDENTIFIER / MATERIAL ASSIGNMENT from my legacy system, but I am not getting a clue on how to use LSMW (direct input or batch input).
    I am not seeing any BAPI for uploading it through LSMW except the one on BUS1077 (method SAVEREPMUL), and I guess this is used along with an IDoc.
    I appreciate your help and suggestions.
    David

    Thanks John.
    I will try the option you have suggested. The business team also told me to look at CG33, but they mentioned some format issue; they probably meant the same thing you said, but they were not aware of a workaround for it. Your information is interesting.
    The other option given to me was to create a recording for each and every tab, such as Substance header / Identifier / Material assignment etc., separately. Technically I don't feel good about building it that way. Do you think that is a reasonable alternate approach?
    With respect and regards
    David

  • Master data load for 0COSTCENTER (Cost Center) failing

    Hi Experts
    I have a master data load for 0COSTCENTER (Cost Center). The load started failing at the DTP a couple of days ago:
    (R) Filter Out New Records with the Same Key
          (G) Starting Processing...
          (R) Dump: ABAP/4 processor: DBIF_RSQL_SQL_ERROR
    I am unable to understand the reason for this failure. I tried loading the data one cost center at a time and it still fails, so I doubt it is an internal table storage issue as suggested by the dump.
    Could you help me with this one, please?
    Regards
    Akshay Chonkar

    Thanks all,
    I have got the issue resolved.
    The error stack for the DTP had accumulated so many entries that it was unable to process further.
    So I deleted the entries in the error stack using SE14 and then executed my DTP again.
    Everything is fine now; thanks for your help.
    Regards
    Akshay Chonkar

  • Data load for VK11

    Hi All,
    Can anyone advise me about the data load for VK11? I mean, which method would be easier?
    I would appreciate it if you could send me the developed custom code.
    Thanks.

    Hi Kiran,
    You can also use the BAPI BAPI_PRICES_CONDITIONS.
    Please check this link for sample code:
    Re: Sample code for BAPI_PRICES_CONDITIONS
    Hope this helps.
    Regards,
    Ferry Lianto

  • How does one change the date format for PlayMemories Home folders?

    I am using PlayMemories Home Version 2.0.00.11271 and have many folders within it which contain only photographs imported from my Sony DSC-H9. Unfortunately, PlayMemories Home insists upon dating all folders in a month/day/year manner, and this makes little sense if one wishes to have the folders listed in logical chronological order. How can I change the date format to year/month/day? Changing the name of each folder, one by one, will take a very long time! Your helpful advice in this matter will be greatly appreciated. Thank you.
    System information:
    Operating system: Microsoft Windows XP Professional
    Service pack : Service Pack 3
    Memory: 1.5 GB
    Processor:         Intel(R) Pentium(R) M processor 1.86GHz
    Max. clock speed: 1.86
    Manufacturer: IBM
    Model: 1847W76
    System language setting: English (United States)
    User language setting: English (United States) 

    I too thought that the folder naming format was obviously wrong and couldn't find a way to change it. I do agree that placing photos in folders according to when they were taken is a great idea. I had been considering writing some software to do just that. After discovering that PlayMemories does it, I had it re-import all my photos. Then I wrote a small Perl script to rename all of the folders into year-month-day format. The script is included here. It only acts on a single folder - use it on the root folder where all the PlayMemories folders are. It will rename all folders currently in a month-day-year format. I used it without problems, but of course there is no guarantee that it is error free. It should work with any common version of Perl. Tom
     # There should be one command line argument: the directory to act upon
     if ( scalar(@ARGV) == 0 ) {
      print "Usage: RenameDirs <dir>\n";
      print " Where dir is the directory containing the directories to rename.\n";
      exit;
     }
     $mydir = $ARGV[0];
     chdir $mydir or die "Couldn't chdir to $mydir: $!";
     opendir(ROOTPHOTODIR, ".") or die "Failed to open the pictures directory $mydir: $!";
     @allphotodirs = readdir ROOTPHOTODIR;
     closedir ROOTPHOTODIR;
     foreach $dir (@allphotodirs) {
      if ( -d $dir ) {
       print "$dir is a directory";
       if ( $dir =~ /^(\d{1,2})-(\d{1,2})-(\d{4})$/ ) {
        print " and has the proper format: month $1 day $2 year $3 and will be renamed to ";
        $newname = sprintf "%4u-%02u-%02u", $3, $1, $2;
        print "$newname\n";
        rename $dir, $newname or die "failed to rename $dir to $newname: $!";
       } else {
        print " but is not of the proper format\n";
       }
      } elsif ( -f $dir ) {
       print "$dir is a file\n";
      } else {
       print "$dir is neither a directory nor a file\n";
      }
     }

  • IU Elim. "No data found for processing using current selection conditions"

    Dear Experts,
    While executing the interunit elimination task in the Consolidation Monitor, I am getting the message "No data found for processing using current selection conditions".
    The example is:
    A)
    In Unit X
    GL (399999) Account     Dr. 65000 (Customer Recon. A/c)   (with Trading Partner X)
    GL (499999) Rev. A/c          65000 (with Trading Partner X)
    In Unit Y
    GL (199999) Exp. A/c     Dr. 65000   (with Trading Partner Y)
    GL (299999) Account           65000 (Vendor Recon. A/c) (with Trading Partner Y)
    B) The GLs in InfoCube 0FIGL_C01 are:
    GL Account   CCode   Trading Partner   Debit   Credit
    199999       Y       Y                 65000        0
    299999       Y       Y                     0    65000
    399999       X       X                 65000        0
    499999       X       X                     0    65000
    In the Consolidation Workbench:
    1) I have created a Document Type
    2) Method:
      In the General tab
       a) Two-Sided Selection
       b) Per Transaction Currency selected
    In the Selection tab:
    1st Selection
    GL Account = 299999 (Customer Recon. Account)
    Company    = X
    Trading Partner = X
    2nd Selection
    GL Account = 399999 (Vendor Recon. Account)
    Company    = Y
    Trading Partner = Y
    Difference tab:
    a) Post difference to "Unit from Selection 1"
    b) Key figure "Period Value GC"
    c) Check limit per difference row
    Other Difference
    GL Account = 100099 (Other GL)
    Currency Difference
    GL Account = 100510 (Other GL)
    Questions:
    1) Is the posting appropriate, and does it attract IU elimination?
    2) Are the InfoCube details correct?
    3) Is there any configuration issue?
    Can anyone suggest why I am not able to get the data?
    Thanks
    Rakesh Shrivastav

    Dear Sir,
    Following are my views in the context of your suggestions:
    1. Check the BCS totals data to ensure the trading partner is included.
    This is the view of the source InfoCube 0FIGL_C01:
    GL Account   CCode   Trading Partner   Debit   Credit
    199999       Y       Y                 65000        0
    299999       Y       Y                     0    65000
    399999       X       X                 65000        0
    499999       X       X                     0    65000
    2. Execute the task for the cons group that includes both cons units X and Y.
    The cons group is XYZ:
    X - cons unit
    Y - cons unit
    In the Consolidation Monitor I am executing a test run at the XYZ level.
    3. Although the trading partner for cons unit X should be Y and vice versa, the elimination should still occur with the cons unit X and trading partner X records.
    Same as in the query description.
    4. Make sure that the items 199999, 299999, 399999 and 499999 are included in the elimination method for either selection 1 or selection 2.
    The method's Selection tab view is:
    1st Selection
    GL Account = 299999, 199999
    Company = Y
    TP = Y
    2nd Selection
    GL Account = 399999, 499999
    Company = X
    TP = X
    What is your view on that?
