Extractors for Archive files & online data

Hi experts,
I have a requirement to develop extractors that extract both ADK (archive) files and online data for MM and PS. Online PS data was previously extracted through a generic extractor based on a function module (F2), and MM through a generic extractor based on a view.
I identified tables MSEG and MKPF for MM, and MSEG for PS, as the tables the data needs to be pulled from. The ADK files reside in a file system and are accessible through the PBS index.
1- Could you please guide me through a strategy for how to proceed?
2- Is it possible to develop a generic DataSource with a function module that would be able to fetch both ADK files and online data for PS?
3- For MM, I am thinking of 2LIS_03_BF since I am extracting MM material movements.
     That extractor is delta-capable, but I am looking for a full upload; could you please provide steps to work around that, if possible?
Thanks.

Thanks for replying,
I have already reviewed that document; it is not helpful in my case since I can retrieve archive data using the PBS index. I think I have to develop an FM extractor capable of fetching both online and archived data for application component PS.
Several fields need to be accessed from tables MSEG, AFKO, AFVC, EBAN, ESLL, ESLH, T430...
I already have a function module capable of fetching online data... I need to modify it so that it fetches both online and archived data...
I am looking for the logic here... could you experts please paste template code capable of fulfilling that requirement, or send it to blaise.pascal(at)ymail.com? I would really appreciate it.
Edited by: Blaiso on May 12, 2011 7:27 PM
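
For reference, here is a minimal sketch of what such a function module could look like. It is modelled on the generic-extractor template RSAX_BIW_GET_DATA_SIMPLE and uses the standard ADK read calls (ARCHIVE_OPEN_FOR_READ, ARCHIVE_GET_NEXT_OBJECT, ARCHIVE_GET_TABLE, ARCHIVE_CLOSE_FILE) for the archived part. The function name Z_BW_GET_DATA_ARCHIVE, the extract structure ZBW_MSEG and the archiving object MM_MATBEL are only illustrative assumptions; selection handling, packaging and archive file selection are trimmed down, and if you read through the PBS index you would replace the ADK block with the corresponding PBS read modules.

FUNCTION z_bw_get_data_archive.
*"  Interface modelled on RSAX_BIW_GET_DATA_SIMPLE:
*"  IMPORTING  i_requnr, i_dsource, i_maxsize, i_initflag
*"  TABLES     i_t_select, i_t_fields, e_t_data STRUCTURE zbw_mseg
*"  EXCEPTIONS no_more_data, error_passed_to_mess_handler

  STATICS: sv_done TYPE c LENGTH 1.

  DATA: lv_handle TYPE sy-tabix,                 " ADK archive handle
        lt_mseg   TYPE STANDARD TABLE OF mseg,
        ls_mseg   TYPE mseg,
        ls_data   TYPE zbw_mseg.                 " assumed extract structure

  IF i_initflag = 'X'.
*   Initialization call: store the selections from i_t_select if needed.
    RETURN.
  ENDIF.

  IF sv_done = 'X'.
    RAISE no_more_data.
  ENDIF.

* 1) Online data straight from the database
  SELECT * FROM mseg UP TO i_maxsize ROWS
           INTO TABLE lt_mseg.                   " real code: apply the selections
  LOOP AT lt_mseg INTO ls_mseg.
    MOVE-CORRESPONDING ls_mseg TO ls_data.
    APPEND ls_data TO e_t_data.
  ENDLOOP.

* 2) Archived data via the ADK
*    (archive file selection via SARA / the Archive Information System
*     is omitted here for brevity)
  CALL FUNCTION 'ARCHIVE_OPEN_FOR_READ'
    EXPORTING
      object         = 'MM_MATBEL'               " assumed archiving object
    IMPORTING
      archive_handle = lv_handle
    EXCEPTIONS
      OTHERS         = 1.

  IF sy-subrc = 0.
    DO.
      CALL FUNCTION 'ARCHIVE_GET_NEXT_OBJECT'
        EXPORTING
          archive_handle = lv_handle
        EXCEPTIONS
          end_of_file    = 1
          OTHERS         = 2.
      IF sy-subrc <> 0.
        EXIT.
      ENDIF.

      CLEAR lt_mseg.
      CALL FUNCTION 'ARCHIVE_GET_TABLE'
        EXPORTING
          archive_handle        = lv_handle
          record_structure      = 'MSEG'
          all_records_of_object = 'X'
        TABLES
          table                 = lt_mseg
        EXCEPTIONS
          OTHERS                = 1.

      LOOP AT lt_mseg INTO ls_mseg.
        MOVE-CORRESPONDING ls_mseg TO ls_data.
        APPEND ls_data TO e_t_data.
      ENDLOOP.
    ENDDO.

    CALL FUNCTION 'ARCHIVE_CLOSE_FILE'
      EXPORTING
        archive_handle = lv_handle.
  ENDIF.

  sv_done = 'X'.                                 " everything returned as one full load

ENDFUNCTION.

As for the 2LIS_03_BF part of the question: a full load with that extractor is normally done by deleting and refilling the setup tables (transaction LBWG to delete, OLI1BW to refill material movements) and then loading with a full or full-repair InfoPackage, so the delta capability itself does not need a workaround.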

Similar Messages

  • Directory for archiving files with errors

    Hi,
    My scenario is File to RFC (asynchronous).
    Whenever PI picks a file from the source directory, that file needs to be deleted from the path and archived to some other path.
    If the file has wrong data, it needs to be archived to an error archive folder.
    For this I used archive processing in the file sender adapter, and I also used the directory for archiving files with errors (for the wrong-data files).
    Now the files are archiving successfully into the given archive folder for successful messages. But the error files (wrong data, which throw mapping errors in MONI) are also being archived in the same archive folder.
    These error files need to be archived in the error archive folder.
    How can I achieve this?
    Thanks in advance.
    Vankadoath.

    You can use Raja's logic. In addition, in the second receiver, do a simple mapping UDF to delete the file from the success folder; otherwise your file will be archived in both the success and error folders.
    See this code:
    // Requires java.io.File in the UDF imports
    // Read the incoming file name from the dynamic configuration
    DynamicConfiguration conf = (DynamicConfiguration) para.get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
    DynamicConfigurationKey keyFileName1 = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/File", "FileName");
    String inputFileName = conf.get(keyFileName1);

    // Delete the copy that was archived into the success folder
    File f1 = new File("/usr/sap/XX/" + inputFileName);
    boolean success = f1.delete();
    if (!success) {
        System.out.println("Deletion failed.");
    } else {
        System.out.println("File deleted.");
    }

  • Need to Pass filename for archived file to FTP adapter using SynchRead

    Hi
    I am archiving the source file which I am reading using an FTP adapter, operation SynchRead.
    In my case the source filename is dynamic (abc_<timestamp>.xml), so before the SynchRead I am using an FTP List adapter to get the filename.
    Currently, the archived file is getting a name in the pattern encryptedToken_yyyymmdd_hhmmss (e.g. wQ2c3w7Cj7Y6irWnRKqEu77_jD7kLtj6Uc0QCZIyjC4=_20121122_012453_0305).
    I need to pass the source filename (which I am getting from the FTP List adapter) for the archived file as well.
    Thanks in advance for the help!
    Regards,
    Achal

    Hi Neeraj,
    While trying the above alternative, I am facing an issue when my source file is a .csv file. The file is getting recreated with the original filename and file content, but without the header.
    As per the requirement I need the original file to be recreated. The header of the .csv file has the field names.
    Please let me know how I should configure my FTP adapter to read the header of the .csv file as well.
    Thanks,
    Achal

  • Junk name for archive file!

    Hi All,
    I have a BPEL process with an FTP adapter as the process initiator.
    The FTP adapter does the Get operation, and the read files are supposed to be archived. The FTP server is a Unix machine.
    I am facing a problem while archiving: the file names of the archived files are all being assigned junk values, like
    --> 'LSoO7S4PZnqOCHSy2fovgg==_20101215_104859_0060'
    --> 'vcT89qBcHx3hEyHGHWS6ow==_20101215_122241_0017'
    It is supposed to look something like 'abcd_243.xml_20101215_104859_0060' or 'abcd_345.xml_20101215_122241_0017' (this is how it looks in the case of the file adapter).
    This is an archived XML file.
    The jca:operation entry:
    <jca:operation
    FileType="ascii"
    LogicalDirectory="rm_incoming_file"
    ActivationSpec="oracle.tip.adapter.ftp.inbound.FTPActivationSpec"
    LogicalArchiveDirectory="rm_success_file"
    DeleteFile="true"
    IncludeFiles="abcd_.*\.xml"
    PollingFrequency="10"
    MinimumAge="0"
    OpaqueSchema="false"
    UseRemoteArchive="true">
    </jca:operation>
    What could be the reason for this? How can I resolve this situation?
    Let me know if any more information is required.
    Thanks in advance,
    Roshan

    Hi Roshan
    You get these:
    --> 'LSoO7S4PZnqOCHSy2fovgg==_20101215_104859_0060'
    --> 'vcT89qBcHx3hEyHGHWS6ow==_20101215_122241_0017'
    because the file names are getting encrypted. You need to apply the latest 10.1.3.3.1 MLR #19 if you are on the 10.1.3.3.0 patchset, 10.1.3.4 MLR #9 if you are on the 10.1.3.4.0 patchset, or 10.1.3.5.0 MLR #2 if you are on the 10.1.3.5.0 patchset. Please refer to otn.oracle.com to find the patch downloads.
    Regards
    A

  • Script to archive File Server Data

    Hi All,
    I'm looking for a script to remove file server data while keeping the last 2 years.
    As

    Hi As,
    In addition, to delete old files based on their datetime via PowerShell, please refer to the script below, which will delete files created more than 2 years ago in the folder:
    $years = 2
    $limit = (Get-Date).AddYears(-$years)
    $path = "e:\file"
    # Delete files older than the $limit.
    Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force -WhatIf
    The -WhatIf parameter will list the files that would be deleted by the Remove-Item cmdlet without actually executing the deletion; remove it to perform the actual delete.
    If there is anything else regarding this issue, please feel free to post back.
    Best Regards,
    Anna Wang
    TechNet Community Support

  • Tables for archived sales order data

    Hi,
    ABAP gurus, I would like to know in which tables I will be able to see the data of archived sales orders (it is not VBAK).
    Any help would be greatly appreciated.
    Thanks in advance.
    Mick

    Hi,
    If I am understanding correctly, your system has data archiving implemented.
    If that's the case, then there is no table which stores the archived sales orders.
    Instead you need to go to transaction SARA.
    Choose object name SD_VBAK.
    Click on Read.
    This will show a screen giving the name of a read program and the option to execute it in the background or foreground.
    This is how you can retrieve the information about archived sales orders.
    Hope this helps
    Regards
    Nishant
    Message was edited by:
            Nishant Rustagi
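
    As a small addition: while the archived order data itself no longer sits in a transparent table, the archive administration data does. Below is a minimal sketch of how one could list the archive files that exist for SD_VBAK; the join is an illustrative assumption based on the standard administration tables ADMI_RUN and ADMI_FILES, and the actual order records inside those files are then read via SARA or the ADK read API.

    " Which archive files exist for archiving object SD_VBAK?
    TYPES: BEGIN OF ty_file,
             document   TYPE admi_run-document,     " archiving run number
             archiv_key TYPE admi_files-archiv_key, " key of the archive file
           END OF ty_file.
    DATA lt_files TYPE STANDARD TABLE OF ty_file.

    SELECT r~document f~archiv_key
      FROM admi_run AS r
      INNER JOIN admi_files AS f ON f~document = r~document
      INTO TABLE lt_files
      WHERE r~object = 'SD_VBAK'.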

  • How to make Preview the default reader for pdf files online?

    Hi everyone,
    Many product manufacturers put their product brochures in PDF form online.  I've tried the screen-shot capture method for saved PDF files, but when the file won't even open online, how do I capture a screen shot in the first place?  Is the problem in Safari 6?  Many thanks in advance.

    Hi Anthony,
    So, your instincts were correct: it is Safari.
    I use Firefox as my browser (I'm not a big fan of Safari, or of Explorer on Windows). When I click on your brochure in Firefox, it comes up with a little window asking me what I want to do with the file, with the "Open With Preview" option checked.
    When I do the same with Safari, it opens a new Safari window, displaying the PDF online but not downloading it.
    So, I don't know if you want to download Firefox, but I did want to let you know that's why it was working differently for me!
    Good luck with this - hope you are able to get it to work out for you!
    Cheers,
    GB

  • Hosting for pdf files online

    I'd like to be able to take the files I have converted in my Adobe program and post them on my website so that they can be downloaded. Is there a good service for this? Does Adobe provide this service?
    Thanks!

    You should be able to upload your converted PDF files to your website and link to those assets just like any other file. I'd check with your website hosting company for specific questions about file hosting. You're probably already paying for that service.
    -David

  • DART for archived PI (XI) data?

    We have archived tons of PI 7.0 messages.
    I wonder if DART can be used to view the archived messages.
    However, there are no DART-related transactions such as FTW0, etc.
    So can we use DART in PI 7.0?  If so, how?
    Points guaranteed.   Thanks!

    PI has its own archiving and deletion options. DART is not used in PI.
    http://help.sap.com/saphelp_nw04/helpdata/en/0e/80553b4d53273de10000000a114084/content.htm
    Regards,
    Prateek

  • Extractor for payroll reads data from cluster PCL2?

    Hello all,
    This is the line from help.sap.com about the payroll extractor:
    "The extractor for payroll results reads data from payroll cluster PCL2, not from standard tables."
    Can someone explain what we have to understand from that, and where in R/3 we have to go to check the data?
    Thanks in advance.

    Sometimes the latest information does not come to BW in the delta because a payroll closing process still has to be executed first (or something like that; sorry, I don't have enough functional HR knowledge), it has happened to me before.
    But most likely the user that you are using for extraction does not have enough rights or authorizations.
    Look at these notes:
    397208  (very important!)
    672514
    964569
    329961
    585682
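
    To make the PCL2 statement a bit more concrete: you cannot usefully SELECT from PCL2 directly, because the payroll results are stored there as a compressed cluster; they are read via the cluster read function modules (or viewed in transaction PC_PAYRESULT). Below is a minimal read sketch; the cluster ID 'RU', the result structure PAYUS_RESULT and the personnel number are only illustrative assumptions, since each country version has its own cluster ID and result structure.

    DATA: lt_rgdir  TYPE STANDARD TABLE OF pc261,  " cluster directory entries
          ls_rgdir  TYPE pc261,
          lv_molga  TYPE molga,
          lv_lines  TYPE i,
          ls_result TYPE payus_result.             " US structure; country-dependent

    " Read the cluster directory: one entry per payroll result of the employee
    CALL FUNCTION 'CU_READ_RGDIR'
      EXPORTING
        persnr          = '00001234'
      IMPORTING
        molga           = lv_molga
      TABLES
        in_rgdir        = lt_rgdir
      EXCEPTIONS
        no_record_found = 1.

    " Read the most recent payroll result from cluster PCL2
    DESCRIBE TABLE lt_rgdir LINES lv_lines.
    READ TABLE lt_rgdir INTO ls_rgdir INDEX lv_lines.

    CALL FUNCTION 'PYXX_READ_PAYROLL_RESULT'
      EXPORTING
        clusterid         = 'RU'                   " US cluster; country-dependent
        employeenumber    = '00001234'
        sequencenumber    = ls_rgdir-seqnr
      CHANGING
        payroll_result    = ls_result
      EXCEPTIONS
        no_read_authority = 1
        OTHERS            = 2.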

  • Is it possible to read archive files outside of SAP?

    Hi Experts,
    I would like to read SAP archived data (*.ARCHIVE.ARCHIVING) with a non-SAP application. I already know that the data is automatically compressed (and maybe encrypted?).
    My question: is there some possibility to read archive files outside of SAP? I suppose some decompression would have to be done first.
    Your help is appreciated. Thank you!
    Jan

    Hi Stefan,
    Thanks for your reply. I found some points with your hint. Here is a summary for others:
    460620 - Migrating archive files:
    "Archived data may have to be migrated. One solution is to perform an SLO service archive migration. You can also perform an archive migration for archive files using the Archive Development Kit (ADK)."
    153433 - Access to archived data from other logical systems
    "For data security reasons, the ADK checks whether an archive file was created in the same system and client where it is to be read. If the client, system or file key do no longer correspond to the meta data that were valid at the time of archiving, you can no longer always access the archive file, particularly for reloading. If only the system ID changes when a new installation due to a system copy is carried out, read accesses are still possible. No solution is provided in the standard system. The access to archive files from other logical systems must be taken into account and carried out in a migration project."
    Unfortunately, I still do not know if it is possible to read archived data outside of SAP.
    Best Regards,
    Jan

  • CS6 Production Premium install from downloaded .exe and .7z on USB, both in the same folder; running the .exe gives an error that an archive file is needed

    New install of CS6 Production Premium. I downloaded the .exe to a USB drive that the vendor provided along with the .7z file, because I don't have a lot of data. With both files in the same folder on the USB drive, running the .exe results in an error message asking for an archive file. There are only 2 files in the download. What does it want?

    Same error message from the desktop, still looking for an archive file.
    Files:
    ProductionPremium CS6.7z  6,109,190 KB
    ProductionPremium_CS6_LS7      1,020 KB
    Running Win 8.1 on an SSD, Z97 motherboard, 4790 CPU, GTX 780.

  • Selection in IP for flat file Load

    Hi experts,
    I want to know whether I can have selections available for an InfoPackage (IP) which is created for a flat file as the data source.
    Your early response is highly appreciated.
    Regards
    Patil

    Hi,
       Yes, you can. But at the DataSource level you have to tick the selection fields, so that at InfoPackage level those fields will be available for selection.
    Regards
    Sankar

  • IPhoto file creation date inconsistencies during drag and drop

    I have noticed that if I drag and drop a photo from iPhoto to Finder, the file creation dates in Finder are inconsistent.
    (This question is related to drag and drop only and not File->Export, which always uses the export date and timestamp for the file creation date and thus does not suit my needs).
    TEST A -- If the EXIF DateTimeOriginated is 01/01/2013, and today's date is 03/03/2013, then:
    A1. In some cases when I drag a file to Finder, the EXIF date is used as the file modification/creation date in Finder.
    A2. In some cases, today's date is used as the file modification/creation date in Finder.
    A3. In some cases, a date in between the EXIF date and today's date is used.
    It appears that for case A1, these are files that do not have a modified copy.  That is, if you select the photo in iPhoto and then click "File" -> "Reveal In Finder", the "Modified File" choice will be greyed out.
    For cases A2 & A3, iPhoto has inexplicably decided to create modified versions of these files either today or sometime in the past.
    TEST B -- I have read that unexplained modifications are tied to the auto-rotate function in cameras, and it does seem to be the case when I performed the test below:
    B1. Select a large group of landscape-format photos (these would not have been auto-rotated), then drag and drop to Finder. The file creation dates are set to the EXIF date.
    B2. Add some portrait photos to the group in B1. Now the file creation dates of ALL photos, including the non-auto-rotated ones, are set to the current date.
    The behaviour in B2 is clearly wrong, since the landscape photos should be the same as in B1.  This is bug #1.
    Furthermore, iPhoto appears to be inconsistent about when these modifications are made.  For example, I dragged & dropped an auto-rotated photo on 02/02/2013, then dragged & dropped it again today; the file creation date in Finder (and also the date of the modified file in iPhoto, as shown in iPhoto File->Reveal In Finder->Modified File) can be either the EXIF date (01/01/2013), the date of the last drag & drop (02/02/2013), or today's date (03/03/2013); there does not appear to be any rhyme or reason to this.  This is bug #2.
    In any case, saying "you should never use drag & drop in iPhoto" (as I have read in some other forum posts) isn't a solution, because Apple should either (a) support this function correctly or (b) remove it altogether.  Furthermore, I regularly burn photos to disk for others so having the file date and timestamps correctly set to the EXIF date helps keeping the photos sorted in the directory listings for multiple OS, so File->Export isn't a solution.

    File data is file data. Exif is photo data. A file is not a photo.  It's a container for a photo.
    When you export you're not exporting a file. You're exporting a Photo. The medium of export is a new file. That file is created at the time of export, so that's its creation date. The Photo within that file is dated by the Exif.
    There are apps that will modify the file date to match the Exif.
    The variation you're seeing is likely due to the changes in how the iPhoto library works over the past few versions. Drag and drop is handy, but is not a substitute for exporting, nor intended to be.

  • Extractor for Capacity planning

    Hi, I can't find any extractor for manufacturing capacity planning; the data are stored in tables KBED and KBKO on the R/3 side. Do you have any advice?
    Thanks

    Hi,
    Check with 2LIS_04_* datasources.
    Thanks
    Reddy
