COPADEMO_TRANSACTION.CSV

Hi there,
I have installed the CO-PA business content and found the CSV file on the application server. But when I load the CSV file, I get the message 'Error 4 when loading external data'. When I look in the CSV, I see a different field order than the one in the InfoSource, and they have to be in the same order. Does someone have the correct CSV file? It is a lot of work to sort and reorder the whole CSV.

Yes, the field order in the CSV file should be the same as in the communication structure.
You need to reorder the columns accordingly.
Thanks
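For illustration, here is a minimal Python sketch of reordering the columns of a CSV file to match a given target order (the field names and file names below are hypothetical, the file is assumed to have a header row, and the delimiter may need adjusting to whatever your InfoPackage expects):

import csv

# Hypothetical target order - replace with the field order of your communication structure.
TARGET_ORDER = ["CURTYPE", "RECTYPE", "VERSION", "PALEDGER"]

with open("copademo_transaction.csv", newline="") as src, \
        open("copademo_transaction_reordered.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=TARGET_ORDER, extrasaction="ignore")
    writer.writeheader()
    for row in reader:
        writer.writerow(row)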

Similar Messages

  • How can I update an existing item in SAP using a CSV file?

    Hi,
    I am trying to update an existing item in SAP using a CSV file.
    In the message log I get an error message that the item already exists.
    What should I do in order to update the existing record?
    Thanks, Udi

    Hi,
    I would suggest you use a tab-delimited file and choose the proper option in DTW to update the item master.
    Regards,
    Rahul
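    As a generic illustration (not DTW-specific), converting a comma-separated file into a tab-delimited one could be done like this in Python; the file names are hypothetical:

    import csv

    # Hypothetical file names - adjust to your actual DTW template and data file.
    with open("items.csv", newline="") as src, \
            open("items_tab.txt", "w", newline="") as dst:
        reader = csv.reader(src)                  # comma-separated input
        writer = csv.writer(dst, delimiter="\t")  # tab-delimited output for DTW
        for row in reader:
            writer.writerow(row)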

  • 10g exporting to CSV using client_text_io is not working correctly.

    I have an odd issue which I could do with some help on. I run a function that exports to CSV based on a predefined record group.
    This has been working fine for many months with various customers. Recently a new customer used it; they have 28k rows in the record group and the export is not working correctly.
    The record group has a record count of 28331.
    The CSV produced contains only 3756 records. These are the last 3756 records in the record group, so it is as if the data is being overwritten as it goes, yet all smaller datasets work.
    FUNCTION fun_export_csv (vgraphid NUMBER, p_filename VARCHAR2) RETURN BOOLEAN IS
      out_file  client_text_io.file_type;
      rg        NUMBER;
      i         NUMBER;
      lv_line   VARCHAR2(5000);
    BEGIN
      rg := populate_group('RG11_EXP');
      synchronize;
      -- The FOPEN was not shown in the original post; presumably the file is opened like this:
      out_file := client_text_io.fopen(p_filename, 'W');
      lv_line := '"GIN","Gin Date","PO Num","PO Required Date","Mat Num","Mat Description","Supplier Part No","On Time Delivery(Yes-1, No-0)"';
      client_text_io.put(out_file, lv_line);
      client_text_io.new_line(out_file, 1);
      FOR i IN 1 .. get_group_row_count('RG11_EXP') LOOP  -- this count is 28331
        lv_line := '"' || get_group_number_cell('RG11_EXP.col1', i)   || '","' ||
                   get_group_date_cell('RG11_EXP.grn_date', i)        || '","' ||
                   get_group_number_cell('RG11_EXP.po', i)            || '","' ||
                   get_group_date_cell('RG11_EXP.daterqd', i)         || '","' ||
                   get_group_char_cell('RG11_EXP.item_no', i)         || '","' ||
                   get_group_char_cell('RG11_EXP.desc', i)            || '","' ||
                   get_group_char_cell('RG11_EXP.part_no', i)         || '","' ||
                   get_group_number_cell('RG11_EXP.ontime', i)        || '"';
        client_text_io.put(out_file, lv_line);
        client_text_io.new_line(out_file, 1);
      END LOOP;
      client_text_io.fclose(out_file);
      RETURN TRUE;
    END;

    Hello,
    Try to insert a "synchronize" instruction from time to time:
    i PLS_INTEGER := 1;
    LOOP
      IF MOD(i, 500) = 0 THEN
        synchronize;
      END IF;
      i := i + 1;
    END LOOP;
    ...
    But keep in mind that CLIENT_TEXT_IO generates a lot of network traffic, so it is better and faster to generate the file on the application server and then transfer it to the client machine.
    Francois

  • Multiple CSV exports from one button or PL/SQL procedure?

    I need to have multiple CSV exports from one press of a button. The easiest way I found to do this is to use JavaScript to pop up three windows, each as a CSV link. This is a bit ugly though, and leaves the browser popup windows open after the file has been downloaded.
    I guess I could also build a solution based on branching, but I think that would be difficult to maintain and it reeks of bad design (I'm not a fan of this spaghetti GOTO-style code!).
    I implemented Scott's custom CSV export as found here: http://spendolini.blogspot.com/2006/04/custom-export-to-csv.html
    However, I would like to know if it's possible to download more than one file using this method. I could not work out how to do this.
    Has anyone got any ideas? Simply repeating the code puts the second table into the original CSV file. Is there a way to 'reset' the htp writer or something?
    Any help greatly appreciated,
    Alex

    Sorry for the confusion - I guess I mean it's easy in .NET because you can simply compress the files together and then send one ZIP file down as the response (a rough sketch of this idea appears at the end of this thread). See http://www.developer.com/net/net/article.php/11087_3510026_2 for details.
    I guess I could ask how to do this in APEX - but it seems to me that my original wording addresses the concept at a much more abstract level. I may not find the best solution for my problem if I just ask 'how can I dynamically zip together three tables as separate files and send them to the client?'. I also suspect that this method is not possible in APEX without custom packages. Please prove me wrong!
    I guess even if I could find some kind of JavaScript that didn't open a new window, but was a direct download of the CSV, that would be a good compromise. At the moment, when you click on the link, three windows come up and stay blank until the files are ready for downloading. Then, after the files have been downloaded, the windows must be shut manually. Yes, I could perhaps use JavaScript to make the windows 1x1 pixel and then shut them after a predetermined timeframe - but this is hardly an elegant solution!
    Thanks for your responses.
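    As a language-agnostic illustration of the idea mentioned above (bundling the separate exports into one archive so only a single download is needed), a minimal Python sketch could look like this; the file names and contents are hypothetical, and in APEX you would still need custom code to stream the archive to the client:

    import io
    import zipfile

    # Hypothetical in-memory CSV exports, e.g. one per report/table.
    exports = {
        "orders.csv": "order_id,amount\n1,100\n2,250\n",
        "customers.csv": "customer_id,name\n10,Smith\n11,Jones\n",
    }

    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as archive:
        for filename, content in exports.items():
            archive.writestr(filename, content)

    # buffer.getvalue() now holds a single ZIP file that can be sent as one
    # download instead of opening three separate popup windows.
    with open("exports.zip", "wb") as f:
        f.write(buffer.getvalue())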

  • How Can I Export Our Wiki in a CSV Format?

    Victims of our own success: what was originally created as internal content on our private OS X Server wiki now is being requested for presentation on an externally hosted web service. Management wants this content to be accessible to constituents on the outside in a searchable database on a completely different service not administered by us on the inside.
    The developer building the database has asked for this content to be exported in CSV for easy import into the new database: page title, tags (a.k.a. keywords, not to be confused with HTML formatting tags), and page content.
    Sadly, according to the Wiki Server Admin Guide (p. 73), the Wiki service does not appear to store page content in any kind of database, but rather in separate files. Thus I can't see a way to provide the requested data short of copying and pasting content from each individual wiki page on the OS X Server.
    Considering that we're talking about ~200 pages of content, I'd like to avoid a manual process of that kind. I've been through the official Wiki Server Admin Guide, but it offered no help for exporting content in the way I need.
    I'm fairly comfortable with AppleScript, less so with shell/bash scripting. Since I'm trying to save time, I'm not sure that's my answer on this scale. Debugging a script (for me) might take longer than a simple copy/paste of 200 items (maybe I'm wrong).
    Does anyone have a different way of looking at this I haven't thought of? Am I missing something?
    Thanks in advance,
    Axiom

    Unfortunately, exporting the wiki data into a format that databases can understand is not a trivial process. Pages are stored as a collection of folders that contain the wiki text, the HTML for display, the change history, the comments, and the attachments.
    In a default setup, the page folders are under /Library/Collaboration/Groups/{groupname}/wiki . All folders under Groups can only be viewed by using root access permissions.
    For example, for a wiki page accessed by:
    http://yourserver.com/groups/general/wiki/abc12/mywikipage.html
    Under /Library/Collaboration/Groups/general/wiki
    abc12.page - folder for the wiki page
    |- attachments  - folder for attachments
    |- images       - folder for images
    |- page.html    - HTML (probably a cache) for display
    |- page.plist   - property list file with all the notes on the page
    |- revisions.db - SQLite database file that contains all the revisions
    You would need a script that runs as the _teamserver user to read every .page directory, parse the .plist files, and generate a SQL statement to preserve all the text, markup, and quotations. If you program really well, you could search for the URLs in every .plist file and record the embedded attachments and images, and copy them to another location.
    For my first project, I parsed the .plist files just to get a map of the random page URL and the title. I also found out about "deleted" and "tombstoned" states, and elected to ignore those.
    If I get a moment I'll write about that next.
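    For what it's worth, a rough Python sketch of that first step (walking the .page folders, reading page.plist and writing one CSV row per page) might look like the following; the plist key names are assumptions and need to be checked against a real page.plist, and the script has to run with sufficient privileges (e.g. as root or _teamserver) to read the group folders:

    import csv
    import plistlib
    from pathlib import Path

    # Path from the description above; adjust the group name as needed.
    WIKI_ROOT = Path("/Library/Collaboration/Groups/general/wiki")

    with open("wiki_export.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["title", "tags", "content"])
        for plist_path in WIKI_ROOT.glob("*.page/page.plist"):
            with open(plist_path, "rb") as f:
                page = plistlib.load(f)
            title = page.get("title", "")
            tags = ",".join(page.get("keywords", []))  # assumed key name
            content = page.get("content", "")          # assumed key name
            writer.writerow([title, tags, content])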

  • Loading 361000 records at a time from csv file

    Hi,
    One of my colleagues loaded 361,000 records from one file. How is this possible, as Excel accepts only 65,536 rows in one file?
    Also, in the InfoPackage the following settings are selected - what do they mean?
    Data Separator   ;
    Escape Sign      "
    Separator for Thousands   .
    Character Used for Decimal Point   ,
    Please let me know.

    hi Maya,
    It is quite possible. Other than MS Excel, there are editors like TextPad (and Windows Notepad) that support more than 65k rows; the file may have been generated by a program or edited outside Excel, or a newer version of Excel was used - MS Excel 2007 supports more than 1 million rows.
    E.g. we have this CSV file:
    customer;product;quantity;revenue
    a;x;"1.250,25";200
    b;y;"5.5";300
    Data Separator ;
    - the character/delimiter used to separate the fields.
    Escape Sign "
    - in "1.250,25";200 the quotes ensure the quantity is read as 1.250,25 even though it contains the separator characters.
    Separator for Thousands .
    - 1.250,25 means one thousand two hundred and fifty, plus the decimal part.
    Character Used for Decimal Point ,
    - the comma in 1.250,25.
    check
    http://help.sap.com/saphelp_nw70/helpdata/en/80/1a6581e07211d2acb80000e829fbfe/frameset.htm
    http://help.sap.com/saphelp_nw70/helpdata/en/c2/678e3bee3c9979e10000000a11402f/frameset.htm
    hope this helps.
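    To make the settings concrete, here is a small Python sketch that reads such a file with ';' as data separator, '"' as escape sign, '.' as thousands separator and ',' as decimal point; the file name is hypothetical and the column names are taken from the example above:

    import csv

    def parse_amount(text):
        # "1.250,25" -> 1250.25 ('.' is only a thousands separator here)
        return float(text.replace(".", "").replace(",", "."))

    with open("sales.csv", newline="") as f:
        reader = csv.DictReader(f, delimiter=";", quotechar='"')
        for row in reader:
            print(row["customer"], row["product"],
                  parse_amount(row["quantity"]), parse_amount(row["revenue"]))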

  • Error while creating table from csv file

    I am getting an error while creating a table using the 'Import Data' button for a CSV file containing 22 columns and 8 rows. For the primary key, I am using an existing column 'Line' with the 'Not generated' option.
    ORA-20001: Excel load run ddl error: drop table "RESTORE" ORA-00942: table or view does not exist ORA-20001: Excel load run ddl error: create table "RESTORE" ( "LINE" NUMBER, "PHASE" VARCHAR2(30), "RDC_MEDIA_ID" VARCHAR2(30), "CLIENT_MEDIA_LABEL" VARCHAR2(30), "MEDIA_TYPE" VARCHAR2(30), "SIZE_GB" NUMBER, "RDC_IMG_HD_A" NUMBER, "START_TECH" VARCHAR2(30), "CREATE_DATE" VARCHAR2(30), "RDC_MEDIA_DEST" VARCHAR2(30), "POD" NUMBER, "TAPE" NUMBER, "ERRORS_YN" VA
    Any idea?


  • Error while refreshing a report using local csv file

    Hi,
    I'm using BI 4.1 SP02.
    While using the Rich Client, I created a report with some merged queries, where one of the queries is based on a local CSV file - saved on AD on some server, and not in the repository inside BO.
    When refreshing the report with the Rich Client, it all went great.
    Now, while using the BI Launchpad Java-based app, I can't refresh the report - I get the following error:
    "An Internal error occurred while calling 'processDPCommandsEx' API. (Error: ERR_WIS_30270) (WIS 30270)"
    Should I be able to refresh a report without the Rich Client if it contains a local file (which can be edited only with the Rich Client)?
    If so, has someone run into this error?
    Thank you,
    Or.

    First of all, thanks for both of the replies.
    Second,
    my problem is unlikely to have anything to do with permissions, for one reason -
    when the report uses an XLS/XLSX file in the same folder (with the same name prefix), the report runs without any problem.
    The only problem is refreshing without the Rich Client when the source is a network CSV file.
    Any suggestions?
    Thanks.

  • Bug in the copy cluster roles wizard - CSV reparse points cannot have names with spaces

    Hi All,
    I have identified a bug in the copy cluster roles wizard in 2012 R2, which is preventing me from migrating my clusters from 2008 R2 & 2012 to 2012 R2.
    The bug surfaces when you attempt to copy a VM role from a 2008 R2 or 2012 cluster that has a CSV reparse point name that contains spaces as follows:
    The role appears to copy across fine, but once you've migrated the CSV and brought it online followed by starting the VM, the VM configuration resource promptly fails along with the VM.
    The issue can be spotted inside the copy cluster roles wizard, where it presents itself by removing all text after the first space in the name, so it reads "C:\ClusterStorage\CSV" rather than "C:\ClusterStorage\CSV Test Disk 3", as follows:
    The bug only appears for VMs on CSVs with a reparse point name containing spaces - all other scenarios succeed.
    After searching through the registry on the new cluster, you can clearly see that the VM role hasn't been correctly copied.
    It shows a value called SharedVolumeMappings with the data C:\ClusterStorage\CSV|<guid>|<integer>
    Whereas it should show the data C:\ClusterStorage\CSV Test Disk 3|<guid>|<integer>
    If you modify the data in the SharedVolumeMappings value in both the 0.Cluster\Resources\<VM guid> and Cluster\Resources\<VM guid> keys, the VM will start and work correctly (a rough sketch of this edit appears at the end of this thread).
    The problem for me is that I have many VMs on many different CSVs, all with spaces in the names of the reparse point! It's not feasible for me to go through the registry for every copied VM on every node of the new cluster, as this would take a very long time and would be prone to errors that could cause further problems.
    Can anyone provide any help on this? A hotfix for the issue would be great, if anyone from Microsoft is reading :-)
    My deadline to get the VMs migrated to the new cluster is the 21st February, so I would really like to have a solution to the problem before then if at all possible.
    Many thanks in advance,
    Tom

    Hi Subhasish,
    Thanks for the reply - glad that it's something that is reproducible!
    Do you have any idea of the timescale? Or is it likely that I will have to modify the registry as above to get this working?
    If I am going to have to modify the registry, please can you let me know if the resolution I have tested above is safe and viable? Or are there other settings that I will need to change to make the cluster fully aware of the proper path to the CSVs for its VMs?
    Also, is there a known procedure for renaming the reparse point whilst VMs are running on it? This would allow me to negate the issue before copying the roles to the new cluster :-)
    Thanks again,
    Tom
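    For what it's worth, a rough sketch of the registry edit described earlier in this thread might look like the Python snippet below. The GUID, the corrected path and the exact key location are hypothetical (on some systems the value may sit under a Parameters subkey), so confirm everything in regedit and test on a non-production node first:

    import winreg

    VM_GUID = "11111111-2222-3333-4444-555555555555"   # hypothetical VM resource GUID
    OLD_PREFIX = r"C:\ClusterStorage\CSV|"
    NEW_PREFIX = r"C:\ClusterStorage\CSV Test Disk 3|"

    key_path = r"Cluster\Resources\{}".format(VM_GUID)
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                        winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
        value, value_type = winreg.QueryValueEx(key, "SharedVolumeMappings")
        if isinstance(value, list):   # REG_MULTI_SZ comes back as a list of strings
            fixed = [entry.replace(OLD_PREFIX, NEW_PREFIX) for entry in value]
        else:                         # REG_SZ / REG_EXPAND_SZ come back as a string
            fixed = value.replace(OLD_PREFIX, NEW_PREFIX)
        winreg.SetValueEx(key, "SharedVolumeMappings", 0, value_type, fixed)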

  • Issue in conversion of output file from alv to csv file using GUI_DOWNLOAD

    hi,
    I am using GUI_DOWNLOAD to convert the internal table that I get as the output of an ALV into a CSV (comma-separated) file. I am using the following code, but it is not generating a CSV file; instead it generates a normal space-delimited file.
    The code is as follows:
    DATA : lv_fname TYPE string.
    lv_fname = 'C:\Users\pratyusha_tripathi\Desktop\status8.csv'. " Provide the file path & file name with CSV extension
    CALL FUNCTION 'GUI_DOWNLOAD'
      EXPORTING
        filename              = lv_fname " File name including path, give CSV as extension of the file
        filetype              = 'DAT'
        write_field_separator = '#'      " Provide comma as separator
      TABLES
        data_tab              = itab     " Pass the output internal table
*       fieldnames            =
      EXCEPTIONS
        OTHERS                = 22.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
        WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    Kindly let me know what changes can be made to make my code work. Also, can GUI_DOWNLOAD be used for batch processing and for storing the output on the application server?
    Thanks ,
    Pratyusha

    Hi,
    the short text description for WRITE_FIELD_SEPARATOR is "Separate Columns by Tabs in Case of ASCII Download", so why do you expect a comma?
    Try SAP_CONVERT_TO_CSV_FORMAT and then download.
    And no, GUI_DOWNLOAD is only for downloading via the SAP GUI to a user's computer.
    Best regards,
    Oliver

  • Creation of CSV file on client machine with data from forms

    Hi,
    My requirement is to generate a CSV file(or .XLS) on the client machine ie local drive with the details shown in a form.
    Oracle version -
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
    PL/SQL Release 11.1.0.7.0 - Production
    CORE 11.1.0.7.0 Production
    TNS for Solaris: Version 11.1.0.7.0 - Production
    NLSRTL Version 11.1.0.7.0 - Production.
    I have searched the web for the last couple of days and got to know that TEXT_IO is to be used to generate files on the client machine. However, when I tried TEXT_IO, it was not able to generate the file on the client; rather, it generated it on the database server. After further browsing, there was a link which said that we need to use CLIENT_TEXT_IO to generate a file on the client side. For this, I was required to subclass webutil.pll, which I did, and I corrected the code to use CLIENT_TEXT_IO. The form was unable to compile and could not find the "webutil_core" package.
    I am very confused by the disparity in the information available on the web as to what to use to generate a file on the client side. If anyone has used it in the past, could they please detail what to use to get this sorted?
    Thanks,
    R

    Oracle version - Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
    PL/SQL Release 11.1.0.7.0 - Production
    CORE 11.1.0.7.0 Production
    TNS for Solaris: Version 11.1.0.7.0 - Production
    NLSRTL Version 11.1.0.7.0 - Production. >
    So, what is your FORMS version? This is more important than your database version.
    My requirement is to generate a CSV file (or .XLS) on the client machine, i.e. a local drive, with the details shown in a form. >
    Depending on your Forms version, you would use TEXT_IO (if Forms 6i running in client/server mode) or WebUtil (if Forms 9i or higher).
    I'm going to guess that you are at least using Oracle Forms 9i since you stated that your attempt at using TEXT_IO produced a file on the DB server.
    There is more to using WebUtil than just attaching WEBUTIL.PLL. If you had performed a simple search of the Forms Help System, you would have found numerous WebUtil topics, including: Introduction to WebUtil, Configuring WebUtil, Using WebUtil in Your Applications, and the WebUtil User's Guide. If you have Oracle Forms release 10g or higher, WebUtil is included when you install Forms; however, you do need to configure your installation to use WebUtil, and you must download the Java COM Bridge (jacob.jar) from SourceForge. Take a look at the Configuring WebUtil Forms Help topic to find out which version of the Java COM Bridge you will need to download.
    After you have successfully configured WebUtil, take a look at the Using WebUtil in Your Applications topic to find out how to implement WebUtil in a form.
    Searching the Internet for answers is great, but don't forget to look at the Forms Help System, because the majority of your questions can be answered there. :)
    Lastly, configuration of WebUtil is primarily done on your Application Server (AS). However, if you plan to perform preliminary runtime testing by running your form from the Forms Builder, then you will need to configure your local runtime to support WebUtil as well as configure your AS. The steps are exactly the same. A common misstep is to skip a step during the configuration because you don't think it applies. Take a look at the Forms Help Runtime Setup Checklist topic for a list of the steps you need to complete in order to enable WebUtil.
    Hope this helps,
    Craig B-)
    If someone's response is helpful or correct, please mark it accordingly.

  • How to populate budget files file_budget.csv and file_acct_budget.csv

    Hi, we have implemented Financial Analytics 7.9.5 on an EBS system. We have successfully loaded all the data except the budget information. Recently I came to know that there is no budget ETL in ORAR12 and that we need to use the Universal Adapter to populate the budget information.
    According to this guideline, I'm supposed to populate the file_budget.csv and file_acct_budget.csv files manually and start the extract. Now I'm having trouble filling these files. These files expect many IDs to be filled in, like ORG_ID etc., and the documentation says these IDs should be the same as the ones in the warehouse. I don't understand how to fill this in. Has anybody done this already? If so, please guide us on how to fill these files; I would appreciate it if you could post a sample record for each of them.
    Thanks.

    You need to go to the corresponding dimensions and see how the integration_id is populated for each dimension. For this, you will have to open the corresponding SDE mapping in Informatica and trace back the integration_id. The document says which table's integration_id to look for, for each of the IDs in the fact. For example:
    GL_ACCOUNT_ID
    GL Account identifier. Populate with integration_id from w_gl_account_d.
    PRODUCT_ID
    Product identifier. Populate with integration_id from w_product_d.
    COMPANY_ORG_ID
    Company Org identifier. Populate with integration_id from w_int_org_d where company_flg = Y.
    For example, if w_gl_account_d's integration_id is formed from gl_code_combinations.code_combination_id in EBS, you need to populate the code_combination_id (from EBS) of your budget record, for each row in the budget fact csv file.
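    As a rough illustration of that lookup (all file and column names below are hypothetical; the real source of each integration_id comes from the SDE mapping), filling GL_ACCOUNT_ID in the budget file could be scripted like this:

    import csv

    # Build a lookup from the EBS code_combination_id to the warehouse integration_id.
    gl_lookup = {}
    with open("w_gl_account_d_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            gl_lookup[row["CODE_COMBINATION_ID"]] = row["INTEGRATION_ID"]

    with open("budget_raw.csv", newline="") as src, \
            open("file_acct_budget.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=["GL_ACCOUNT_ID", "BUDGET_AMOUNT"])
        writer.writeheader()
        for row in reader:
            writer.writerow({
                "GL_ACCOUNT_ID": gl_lookup.get(row["CODE_COMBINATION_ID"], ""),
                "BUDGET_AMOUNT": row["AMOUNT"],
            })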

  • Sender "Mail" adapter - CSV file attachment

    Hi there
    I'm looking for some help in configuring a sender mail adapter that receives ".csv" files. I did read some blogs that mention using the "PayloadSwapBean" module to read the mail attachment instead of the mail content. My problem is now to convert the ".csv" file into a message. Is there a module that I can use (is it the "MessageTransformBean"), and if so, how? Any help would be appreciated.
    Thanks
    Salil

    Hi Salil,
    If you want to send a mail with a body and attachments, the message sender HAS to provide an XI message with attachments. I doubt a plain CSV file qualifies.
    As Renjith said, you need to convert the CSV to XML.
    A short description of the standard modules:
    MessageTransformBean is a standard module used to apply an XSLT mapping in the adapter module by using Transform.class (this XSLT mapping is done to create a mail package; don't confuse it with the actual mapping - in your case this is NOT for converting CSV to XML).
    This module can also be used to change the name and type of payloads by using Transform.contentType, Transform.contentDisposition and Transform.contentDescription.
    PayloadSwapBean is a standard module for replacing payloads with other payloads (swap).
    If you want to give each attachment a certain name, use the parameters: swap.keyname for the name of the payload and swap.keyvalue for its value.
    I hope the use of the standard modules is now clear.

  • Read a CSV file and dynamically generate the insert

    I have a requirement where multiple CSVs need to be exported to a SQL table. So far, I am able to read the CSV file and generate the insert statement dynamically for selected columns; however, when the insert statement is passed to $cmd.CommandText, the values are not evaluated.
    How can I get the string evaluated in PowerShell?
    Import-Csv -Path $FileName.FullName | % {
        # Insert statement.
        $insert = "INSERT INTO $Tablename ($ReqColumns) Values ('"
        $valCols = ''
        $DataCols = ''
        $lists = $ReqColumns.split(",")
        foreach ($l in $lists) {
            $valCols= $valCols + '$($_.'+$l+')'','''
        }
        # Generate the values statement
        $DataCols = ($DataCols + $valCols + ')').replace(",')","")
        $insertStr = @("INSERT INTO $Tablename ($ReqColumns) Values ('$($DataCols))")
        # The above statement generates the following insert statement:
        # INSERT INTO TMP_APPLE_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )
        $cmd.CommandText = $insertStr   # does not evaluate the values
        # If the same statement is passed as below then it executes successfully:
        # $cmd.CommandText = "INSERT INTO TMP_APL_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )"
        # Execute query
        $cmd.ExecuteNonQuery() | Out-Null
    }
    jyeragi

    Hi Jyeragi,
    To convert the data to the SQL table format, please try this function out-sql:
    out-sql Powershell function - export pipeline contents to a new SQL Server table
    If I have any misunderstanding, please let me know.
    Best Regards,
    Anna
    TechNet Community Support
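    As a general, language-agnostic side note on the original problem: values placed inside single quotes are not expanded into the SQL text, and a safer pattern is to bind the CSV values as parameters instead of building the INSERT string. A minimal Python sketch of that pattern (using an in-memory SQLite table purely for illustration; the file and column names are hypothetical):

    import csv
    import sqlite3

    conn = sqlite3.connect(":memory:")  # stands in for the real SQL Server target
    conn.execute("CREATE TABLE TMP_APPLE_EXPORT (PRODUCT_ID TEXT, QTY_SOLD INT, QTY_AVAILABLE INT)")

    columns = ["PRODUCT_ID", "QTY_SOLD", "QTY_AVAILABLE"]
    insert = "INSERT INTO TMP_APPLE_EXPORT ({}) VALUES ({})".format(
        ",".join(columns), ",".join("?" for _ in columns))

    with open("apple_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            # Values are bound as parameters, so no quoting or expansion issues arise.
            conn.execute(insert, [row[c] for c in columns])
    conn.commit()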

  • Runtime error in KEFC Transaction while uploading CSV format file

    Hi Experts,
    This is regarding a runtime error while executing the KEFC transaction in R3D.
    It works successfully when uploading the file in text format, but there is a runtime error when uploading in CSV format.
    It shows "The transfer was terminated. However, content errors occurred".
    Please provide the solution.
    Thanks in advance.
    Regards,
    Priya.

    Hi,
    To upload a CSV file, the following function module is used:
    CALL FUNCTION 'TEXT_CONVERT_CSV_TO_SAP'
      EXPORTING
        i_field_seperator    = ';'
*       i_line_header        =
        i_tab_raw_data       = lt_file
        i_filename           = p_filename
      TABLES
        i_tab_converted_data = p_table
      EXCEPTIONS
        conversion_failed    = 1
        OTHERS               = 2.
    Please check which separator this FM and your file are using. They should be the same.
    Regards,
    Fernando
