Mass upload of folder structure?

Dear Experts,
As per a client requirement, we need to check the feasibility of uploading multiple documents together with their folder structure. Suppose we have a CD containing a folder structure with different files in the folders, and we want to copy/create the same folder structure in the content server. Is there any BAPI available for this?
While searching for this requirement I found BAPI_DOCUMENT_CREATE2… but I want to know whether it can directly copy the folder structure on the CD to the content server.
In short: as per the requirement, will it be possible to create the folder structure from the CD?
Thanks in advance.

Hi Sam,
as explained in your SAP customer message, BAPI_DOCUMENT_CREATE2 can be used for this requirement. If you want to hand over document structure data, please fill table DOCUMENTSTRUCTURE. For further information on the BAPI's functionality, please see SAP note 766277. A minimal sketch follows.
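Purely as an illustration of the direction, not a definitive implementation: the sketch below creates one document info record for a file from the CD and checks it into the content server. The document type 'DRW', the storage category 'DMS_C1_ST' and the file paths are assumptions; lt_structure is passed only as a placeholder for the hierarchy data described in note 766277.

    data: ls_docdata    type bapi_doc_draw2,
          ls_return     type bapiret2,
          lv_doctype    type bapi_doc_aux-doctype,
          lv_docnumber  type bapi_doc_aux-docnumber,
          lv_docpart    type bapi_doc_aux-docpart,
          lv_docversion type bapi_doc_aux-docversion,
          ls_file       type bapi_doc_files2,
          lt_files      type standard table of bapi_doc_files2,
          lt_structure  type standard table of bapi_doc_structure.

    ls_docdata-documenttype = 'DRW'.            " assumed document type
    ls_docdata-description  = 'File from CD'.

    ls_file-docpath         = 'D:\folder1\'.    " source folder on the CD
    ls_file-docfile         = 'file1.pdf'.      " file to check in
    ls_file-storagecategory = 'DMS_C1_ST'.      " example storage category
    append ls_file to lt_files.

    " fill lt_structure with the superior/subordinate document keys to
    " reproduce the CD folder hierarchy (see SAP note 766277)

    call function 'BAPI_DOCUMENT_CREATE2'
      exporting
        documentdata      = ls_docdata
      importing
        documenttype      = lv_doctype
        documentnumber    = lv_docnumber
        documentpart      = lv_docpart
        documentversion   = lv_docversion
        return            = ls_return
      tables
        documentfiles     = lt_files
        documentstructure = lt_structure.

    if ls_return-type ca 'EA'.
      " creation failed: evaluate ls_return-message
    else.
      call function 'BAPI_TRANSACTION_COMMIT'.
    endif.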
Best regards,
Christoph

Similar Messages

  • Mass upload of Org structure

    Hello,
Is there any means to mass upload the org structure?
    Regards,
    Madhan

Search on this forum for the document "transporting EBP systems"; it's quite an old document, but still helpful.
    Also, take a look at RHMOVE00 and RHMOVE30, that might come in handy.
    Regards,
    Robin

  • Custom application to upload entire folder structures into KM at once

    Does anyone know of a way (or has an idea of how) to create an application that could be put in an iView to allow the user to select a folder on his/her PC to be uploaded onto KM, subfolders and all?  I have found sample apps that let you upload single files into KM, but none for multiple files or folder structures.
    Thank you

    Hi,
Some ideas:
Maybe you should try zipping the whole folder and uploading the archive. The upload program would then unzip the file using the Java zip APIs and move the folder into a File System or FSDB repository.
If you are using a CM repository of type DB, you will have to look at the IResource and ICollection creation APIs to build the folder structure and transfer the content.
    Regards
    Pran

  • Regarding Error handling in Mass Upload of Organizational Structures Using RHALTD00

    Hi all,
I am using the standard program RHALTD00 for mass upload of organizational objects into the SAP system. But if our input file has any error records, how can we track those errors? Please share your inputs.
    Thanks,
    Cs

    Hi Sirisha,
Report RHALTD00 imports the records from the file and creates a batch input session, which you can process either directly or using batch input processing:
Batch input processing: when you run the report this way, error records remain in the session, where they can be analyzed and reprocessed (transaction SM35).
Direct processing: the requirements for running the report this way are:
- The dataset must contain all records of infotype 1000; only then can the other information be imported in any order.
- The dataset must not contain records with workflow infotypes (infotype numbers 1200 to 1299); such records are not included in processing.
A sketch of a record-level error-tracking alternative follows.
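If you need record-level error tracking rather than post-processing the session in SM35, a common alternative pattern is to build the batch input data yourself and use CALL TRANSACTION with message collection. This is only a minimal, hedged sketch; the BDC screen/field mapping is omitted, and the choice of transaction PP01 (maintain org. objects) is an assumption:

    data: lt_bdcdata type standard table of bdcdata,
          lt_msgs    type standard table of bdcmsgcoll,
          ls_msg     type bdcmsgcoll.

    " ... fill lt_bdcdata with the screens and field values
    "     for one input record ...

    call transaction 'PP01' using lt_bdcdata
                            mode 'N'              " no screen display
                            update 'S'            " synchronous update
                            messages into lt_msgs.

    loop at lt_msgs into ls_msg where msgtyp = 'E' or msgtyp = 'A'.
      " write the failed input record together with ls_msg-msgid
      " and ls_msg-msgnr to an error log or error file
    endloop.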
    Thanks,
    Sreeram

  • How can I upload multiple files or whole folder structures in one go to the Cloud?

How can I upload multiple files or whole folder structures to the Cloud in one go? Uploading lots of files one at a time does not help my workflow.
    All help is much appreciated.
    Paul.

    Hi,
Uploading multiple files is browser-specific.
Internet Explorer won't let you select and upload multiple files to the cloud.
If you want to upload multiple files, log in to Creative Cloud using the Firefox or Chrome web browser; you can then select multiple files in the Browse window to upload.
    You can't upload folders directly.
    Thanks,
    Baljeet

  • Upload or create folder structure

    Hello,
I wonder if it is possible to create a folder structure in my My Files section. It is not possible to upload a folder, and it seems not possible to create a folder structure either. This is an important function for me, for organisational reasons.

I agree. I would also like that feature. When we begin to use this program more frequently, we will have a ton of files all over the place.
    Thanks

  • How can I batch Save For Web and maintain a multilevel folder structure?

    First... I am using an old version of Photoshop.... 6.0 to be exact.  If your answer is to buy the newer versions, I respectfully say I know that's AN answer, but I'm looking for something that will work with my setup.  The version we have works just fine for the very few things we need it for, and we use new versions of InDesign and Acrobat for all else. 
    So here's the challenge:
We have a weekly magazine featuring 1,200 plus photos of cars for sale, each saved in a folder named for the sales rep, week and day. I must maintain this folder structure throughout the process, as the file names are duplicated by cameras and across reps. So I cannot dump all batch-converted pics into one single folder, yet I need to reduce the Quality setting, which I find when completing a Save As in Photoshop.
So I want to batch convert all the photos in multiple folders, some 3 folders deep, all at once: reduce to 8 inches across, change to 72 dpi, and reduce the quality slider to 5, which creates a nice 60-90K file. BUT it wants to save all of them in one particular folder. I need to just overwrite the original file with the new smaller one, or save the new small version into the same folder the big one is in, but have it work with the "include all sub-folders" option in batch.
    A sample File system .... 4513 Photos - Color <day of week> (3 folders) - <disk number><rep>Color (20 folders) - actual photos.
    After converting, we push up the final level of folder to our website and that organization allows us to use them there the way we need.  The uploading to the site and batch functions are done 3 to 4 times per week.... on different days.
    Is there a way to reduce the quality (compression) of the jpgs that does not require the creation of a new file in a particular folder, or is there a way to have the resulting file just drop back into the folders and subfolders during a batch function?  I just can't figure this out!
    As an alternative, what other programs might do this?

    You should be able to just use the html output of the Export for Web function and, if the video's converted correctly, just swap the file names at the appropriate places. Have you tried renaming VisualHub's output to what the Export for Web html wants, then placing it in the proper location?
I'm not aware of a way to batch create the HTML files, but if I had to, I'd probably look at Automator to see if I could figure out its text manipulation options.

  • SharePoint List Filter to narrow a folder structure

    Hello,
The company I work for has a document library containing folders for each of their hundreds of clients. The current process is to load all of the folders in alphabetical order so that the users can scroll through and find the correct client relatively quickly. This takes longer than it should, considering how many folders need to load.
I was trying to utilize the SharePoint List Filter to pull the Title column so the users could filter based on the name and we wouldn't need to pull all of the data each time the library was opened, but it seems that the Title and Name columns don't hold their connection to the list. Each time I attempt to set the connections and then use the filter, I get the "This filter is not connected" error.
Is it possible to use this web part in this manner (or is there another web part that will work), or should I suggest moving to metadata instead of a folder structure if they want speed of use?
    Thank you in advance!

    Hi,
Based on your description, my understanding is that you want to filter the document library by the Title column, so that users can filter by name and don't need to pull all of the data each time the library is opened.
Refer to the following steps:
1. Open your document library and choose Export to Excel.
2. In Excel, click Insert > PivotTable and create a pivot table.
3. Use report filters to filter the Title column.
4. Save the workbook and upload it to a document library.
5. Create a new page in SharePoint and add the Excel Web Access web part, configured to display your workbook (or add the Excel Web Access web part directly in your document library and configure it there).
Now you can filter the document library by the Title column.
    Here is a link about how to Create a pivot table in Excel 2010:
    http://www.techonthenet.com/excel/pivottbls/create2010.php
    Here is a link about how to use Report filters:
    http://www.gcflearnfree.org/excel2010/20.5
    Besides, you can refer to the following blog:
    http://consulting.risualblogs.com/blog/2014/09/10/filtering-excel-webparts-in-sharepoint-using-query-string-parameters/
    Best Regards,
    Lisa Chen

  • Mass upload of BP into CRM

    Hello!
I am performing a mass upload of BPs into CRM. I use FM BUPA_CREATE_FROM_DATA to create the BP itself and BUPA_ADDRESS_ADD to save the BP addresses, and I call these FMs in the same sequence for each BP.
If I do that for 1-2 BPs, it works perfectly: all BPs are saved in CRM with their addresses on COMMIT. As soon as the number of BPs uploaded between two commits increases, for example to 100, an error occurs: E831(AM) in ADDR_MEMORY_SAVE.
As far as I understand, this happens because the address save is performed before the BP save on commit. I tried saving the BPs first (with commit) and then their addresses, and that worked for mass upload without any error.
So, is there any idea how to upload each BP together with its address, without errors, in a mass upload?
Thank you.

You can upload millions of records using these function modules; in fact, I am doing the same.
Make sure that the data structures and tables passed to the function modules are cleared each time, and that the FMs are called in the order below (a sketch follows):
    BAPI_BUPA_CREATE_FROM_DATA
    BAPI_BUPA_ADDRESS_ADD
    BAPI_TRANSACTION_COMMIT.
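A minimal, hedged sketch of that sequence per business partner; lt_upload/ls_upload stand in for your input file, and the field values are illustrative only:

    data: ls_centraldata   type bapibus1006_central,
          ls_centralperson type bapibus1006_central_person,
          ls_address       type bapibus1006_address,
          lv_partner       type bu_partner,
          lt_return        type standard table of bapiret2.

    loop at lt_upload into ls_upload.            " hypothetical input table
      clear: ls_centraldata, ls_centralperson, ls_address,
             lv_partner, lt_return.

      ls_centralperson-firstname = ls_upload-firstname.
      ls_centralperson-lastname  = ls_upload-lastname.

      call function 'BAPI_BUPA_CREATE_FROM_DATA'
        exporting
          partnercategory   = '1'                " 1 = person
          centraldata       = ls_centraldata
          centraldataperson = ls_centralperson
        importing
          businesspartner   = lv_partner
        tables
          return            = lt_return.

      ls_address-city = ls_upload-city.
      call function 'BAPI_BUPA_ADDRESS_ADD'
        exporting
          businesspartner = lv_partner
          addressdata     = ls_address
        tables
          return          = lt_return.

      call function 'BAPI_TRANSACTION_COMMIT'
        exporting
          wait = 'X'.                            " commit after each BP
    endloop.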
    Thanks,
    Thirumala.

  • Automatic creation of KM folder structure from xml pattern

    Hi all,
is it possible to create a KM folder structure automatically, following the tree structure of an XML document?
    For example:
    an xml-document with following content:
<item id="1" level="0">abc</item>
<item id="2" level="1">aaa</item>
<item id="3" level="1">bbb</item>
<item id="4" level="2">ccc</item>
<item id="5" level="0">def</item>
    I'd like to create a KM folder structure like this:
    - abc
    ---aaa
    ---bbb
-----ccc
    - def
    Does anyone have any idea to implement this scenario in KM?
    Much obliged!
    Steffi

Not currently - well, at least not easily - there is always the programmatic approach.
I would suggest you create an FSDB repository - this means you can then create your structure in the file system using standard desktop tools.
After that you could do a mass copy back into the DB-based KM repository if you so wished.
For your downstream systems you could always ICE the data across, meaning you don't have to create everything again.
    Haydn

  • BI content with folder structure in portal Content Area

Hi dear friends,
I have a requirement in my current project where I am integrating BI reports with SAP EP; iViews are associated with the BI reports. In the portal, clicking a workset should display folders in the portal content area with icons, instead of in the detailed navigation; clicking one of the displayed folders should display its sub-content with icons.
I tried various ways, but later I learned that there is an SAP BI business package that comes with predefined iViews associated with BI. By importing this business package we can display the BI content in the portal in a folder format.
I am using BI and EP 7.0.
    Ratnakar

    Ratnakar,
Let me make sure I understand your requirement; correct me if I have it wrong. You need to display the BI reports in a folder structure similar to what you see when you log on through the BEx Analyzer or WAD. If that is correct, you need to upload the BI role which contains all the required folder structures, with the BI queries or BI query URLs assigned to that role. Once you upload this role to the portal, you will see the folder structure in the portal content area.
    Example
    BI Reports  - Folder 1
      I--> FI reports  -- Subfolder 1
      I            I--> AP reports 1
      I--> CO reports  -- Subfolder 2
                   I--> Cost Report 1         
    Hope it helps,
    Cheers,
    Balaji

  • Assign specific metadata for folder structures in the Content Server

    Hi to all,
I am working with Oracle Content Server 10g and Desktop Integration Suite, and I would like to know how I can restrict or enable some specific metadata fields (from the default metadata) for different content folders. This is different from the Information Field Inherit Configuration function.
Thanks!

    I am a bit unclear about your question:
    - do you ask about metadata assigned to folders, or
    - do you ask about metadata assigned to items in folders?
Starting from #2: if you forget about folders, there is standard functionality that you can use to restrict, enable, etc. metadata for an item. Read this chapter to get the full details: http://docs.oracle.com/cd/E21764_01/doc.1111/e10978/c04_metadata.htm#sthref288
I am not sure whether you can use any of this functionality (profiles, option lists, etc.) when you check in a new folder. I doubt it, though. The logic of folders is slightly different - whilst e.g. profiles correspond to a "content item", you don't find such a correspondence in the folder hierarchy (why couldn't a folder contain various content types, for instance?).
The last question is from where you want to use this functionality - even for items. In 10g, Desktop Integration Suite's functionality was rather limited (often it was assumed that a user would just 'throw' an item into a folder and metadata would be inherited from the folder). You might have to upgrade your DIS to 11g; it should work even with a 10g Content Server, but make sure you verify that before a mass upgrade. In 11g, DIS should offer profiles, etc. with full capabilities.

  • Transparent integration of a new drive into the folder structure of a file server?

Hello,
I'm looking for a solution to a problem that seems simple, but obviously is not that easy to solve:
I have to integrate a second external RAID system into our Mac OS X 10.6.8 Server based file sharing server, which is accessed from Mac and Windows clients via AFP and SMB. I want to move one or two main folders from the old RAID to the new one, but this physical change has to be invisible to the users, as I don't want to confuse anyone.
So my idea was to move the folders to the new drive and create symbolic links at the old locations pointing to the new ones. This works very well for the Windows clients accessing the server via SMB, but does not work for the Macs on AFP.
So I tried to move one folder to the new drive and mount it at the old location. Again, this works fine for SMB but not for AFP; the mount point does not show up in a directory listing via AFP.
Do you have any other idea how I might integrate the new drive transparently into the old folder structure?
Thanks for your input
Florian
PS: I could use SMB on the Macs, but for some reason, whenever I try to log in to the server from a Mac via SMB, the user name and password are accepted, but then the Mac client displays a message saying that I don't have the right to access the share. The same share works using AFP.

    Define 'invisible' please.
    Do you mean you need to be able to do this live, while users are on the system? or just that you can shut the machine down, reconfigure it, and bring it back up with the new configuration, even though the shares look the same to the users?
    I'm guessing the latter, but it's worth asking.
    Ultimately the problem lies in the way the file sharing systems deal with multi-volume sharepoints, and it's not easy. If you think about it, say you have a 1TB array handling the main sharepoint and you want to substitute one of the directories in that sharepoint with a new, empty 4TB array.
    When the user mounts the sharepoint they get a little status bar at the bottom of the window showing the available space... how much is that? Well, initially it would be however much of the 1TB volume is unused... except if they switch to the linked directory they now have 4TB available (or thereabouts)... so the amount of space available has changed even though, from the user's standpoint, they're still on the same share.
    It isn't valid for the OS to report 4 TB as free because that isn't the case unless you're in this specific directory.
    It also isn't valid for the OS to report 5TB free, even though there is that amount of space altogether.
    It also isn't valid for the OS to report 1TB free because you could upload 4TB of data if you put it in the right place.
    There are few solutions to this. Microsoft sort of addressed this with their DFS solution in Windows Server, but it's not trivial.
    Unfortunately you can't just blow it off and ignore the issue.
Off hand there's only one thing I can think of that *might* work. If it doesn't, then you're down to using multiple sharepoints on the server, with users mounting both disks simultaneously (which can be automated, for what it's worth).
    The one thing to try is to statically mount the second RAID at the appropriate location so that, as far as the OS is concerned, it looks like just another directory even though it's on a different disk. You'd do this by editing /etc/fstab and adding a line like
    UUID=AABBCCDD-79F7-33FF-BE85-41DFABE2E2BA /path/to/mount    hfs   rw
    to /etc/fstab (this file may not currently exist, so just create it as root).
    The first field is the UUID of the drive (which you can get via diskutil info)
    The second field is the path where you want this drive to appear in the filesystem - i.e. somewhere in the path of your sharepoint. There must be an existing (empty) directory at this path when the disk mounts.
    The third field identifies the disk as HFS
    The fourth field marks the disk as read-write
    Now when the system boots it should locate this disk and mount it on top of the existing RAID volume. If you cd to your sharepoint you should see the existing drive and if you cd from there into your mounted directory you should be looking at your new RAID.
    Now, this all works fine (or, at least, should do) from a standard OS standpoint. The big question is whether the AFP and SMB daemons honor and support this kind of setup... there's one way to try, of course...
    Now for testing purposes you could mount it at a dummy directory, just to see whether it's available to network clients. If it is then your next step would be to clone the data from the directory onto the new drive, then edit the fstab to mount the disk at the appropriate location.
    Note also that the entire volume will replace the directory you mount over - that means you can't replace two (or more) directories with one volume, but you can, of course, partition or setup your RAID into multiple volumes and mount each individual volume over a specific directory.

  • Mass Upload doesn't work

    Hi everybody,
I am trying to upload user photos via the mass upload iView. To do this I created a folder (c:\photos). In this folder I have two files:
    a)  mapping.properties.txt with the line: Anton=1011
    b) 1011.jpg
I use the mass upload iView from the content provided by SAP -> Collaboration -> Demo Role -> iViews. I created a task with the following properties:
    Task name:           Mappupload
    Upload from Folder:           c:\photos
    Photo type:           NORMAL
Apply checked conditions:      (not checked)
    Use Mapping:           (checked)
After starting the task, the photo was not updated and there was nothing in the application log!?
    What could be the problem? I can add the file manually but not through mass upload.
Has anybody used mass upload?
    regards,
    Seed
    Edited by: seed_mopo on Sep 26, 2008 11:04 AM

Hi,
Please see these threads:
    https://www.sdn.sap.com/irj/sdn/thread?messageID=48369#48369
    https://forums.sdn.sap.com/click.jspa?searchID=16795891&messageID=4282098
    https://forums.sdn.sap.com/thread.jspa?messageID=9087#9087
    https://www.sdn.sap.com/irj/sdn/thread?threadID=257887
Thanks,
Hope this is helpful.

  • Mass Upload of condition records in APO system.

    Hi All,
I want to mass upload condition records into the APO system for transaction /N/SAPCND/AO11 from an external file.
Is there an FM available?
If no FM is available, which is the best way to do this:
LSMW or BDC?
Please guide me.
    Thanks,
    Babu Kilari

At least the key fields of the condition table (here 800) must be filled in structure ls_komg:
data: ls_komg       type komg,
      ls_komk       type komk,
      ls_komp       type komp,
      ls_komv       type komv,
      lt_komv       type standard table of komv,
      lv_new_record type char1.
constants: lc_kschl type kschl value 'ZPR0'.   " example condition type
ls_komg-vbeln = '1234567890'.  " document number
ls_komg-posnr = '000010'.      " item number
clear: ls_komv, lt_komv.
ls_komv-kappl = 'V'.         " application V = Sales
ls_komv-kschl = lc_kschl.    " condition type
ls_komv-waers = 'EUR'.       " currency
ls_komv-kmein = 'ST'.        " unit of measure
ls_komv-kpein = '1'.
ls_komv-krech = 'M'.         " calculation type:
                             " M = quantity - monthly price
ls_komv-kbetr = '1234.56'.   " new condition value
append ls_komv to lt_komv.
call function 'RV_CONDITION_COPY'
  exporting
    application              = 'V'
    condition_table          = '800'      " condition table
    condition_type           = lc_kschl   " condition type
    date_from                = '20061101' " valid from
    date_to                  = '20061130' " valid to
    enqueue                  = 'X'        " lock entry
    i_komk                   = ls_komk
    i_komp                   = ls_komp
    key_fields               = ls_komg    " key fields
    maintain_mode            = 'A'        " A = create, B = change,
                                          " C = display,
                                          " D = create with reference
    no_authority_check       = 'X'
    keep_old_records         = 'X'
    overlap_confirmed        = 'X'
    no_db_update             = space
  importing
    e_komk                   = ls_komk
    e_komp                   = ls_komp
    new_record               = lv_new_record
  tables
    copy_records             = lt_komv
  exceptions
    others                   = 1.
if sy-subrc <> 0.
  " handle the error (e.g. message from sy-msgid / sy-msgno)
endif.
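After a successful copy, the records only exist in memory. A hedged sketch of the usual follow-up, assuming the standard save/reset function modules that are commonly used together with RV_CONDITION_COPY:

call function 'RV_CONDITION_SAVE'.   " write the prepared records to the DB
call function 'RV_CONDITION_RESET'.  " clear the function group's memory
commit work.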
