Open Hub File ignores comma separator

Hi friends,
I've got a problem while extracting a CSV file with an open hub service.
In the system, my user (Defaults tab) has the US decimal format, and I can see the values in the InfoCubes correctly.
But when I extract these values through the open hub service to a CSV file, the thousands separator (comma) and decimal point disappear.
I thought it was a problem with my PC's configuration, so I changed the regional options to the US format.
But even after doing this, when I open the CSV file with Excel the thousands separator and decimal point still don't appear.
I know there is a transaction called something like "RSRVT" or similar where the thousands separator can be defined, but I don't remember the exact name.
The field I'm interested in in the open hub file is defined as NUMC, length 17 with 3 decimals. I've also tried defining it with the DEC or CURR format, with a similar result (it failed completely).
Can anybody help me? This is very urgent.
Thanks a lot.
Francis.

Hi
check out this link too...
http://help.sap.com/saphelp_nw2004s/helpdata/en/8e/dbe92341c84242be2c7d3917f1c197/frameset.htm
Re: Flat file data load
swetha

Similar Messages

  • Dynamic File name for Open Hub file

    Hi All,
    I want to create a .csv file using an open hub destination. I want the name of the file to be dynamic based on the month, for example 01.Dump for January and 02.Dump for February.
    Is it possible at all to do that? I want to save the file on the application server. There should not be any manual intervention in the process.
    Thanks in advance.
    Regds
    Raghu

    Hi
    We take the month from the first record that we read.
    Our requirement is to take one month's data and write it to a file whose name contains the corresponding month.
    In the code below, MOC_CODE is the field used for this purpose.
    Hope it helps.
    Regards,
    Raghu
    code:
    #!/bin/sh
    # Script to dump a division's data to a file.
    # Parameters:
    #   $1  Division name to extract from the data file.
    # It re-arranges the columns.
    # It replaces values in strings.

    # Parameters.
    DATA_FILE_NAME="data_file.csv"
    HEADER_FILE_NAME="header_file.csv"
    DIVISION_NAME="$1"

    # Re-arrange header file columns.
    # Transpose rows to columns, comma separated.
    HEADER_ROW=`cat "${HEADER_FILE_NAME}" | tr '\n' ',' | sed 's/,$//g'`
    # Re-arrange header columns.
    # Re-direct the output to a temporary file.
    echo "${HEADER_ROW}" | awk 'BEGIN { FS = ","; OFS = "," } { print $2, $3, $1, $4, $5, $6, $7 }' > "${HEADER_FILE_NAME}.${DIVISION_NAME}.tmp"

    # Prepare awk program.
    # Re-arrange data file columns.
    # Replace UNI in 6th column with UNIT and replace TON in 6th column with TONS.
    # Filter division rows.
    AWK_PROGRAM="BEGIN { FS = \",\"; OFS = \",\"; TVAL = 0 } \$5 ~ /${DIVISION_NAME}/ { sub( \"UNI\", \"UNIT\", \$6 ); sub( \"TON\", \"TONS\", \$6 ); print \$2, \$3, \$1, \$4, \$5, \$6, \$7; TVAL = TVAL + \$7 } END { printf \"${DIVISION_NAME} Total Value: %f\", TVAL > \"${DIVISION_NAME}_Control.ctl\" }"
    # Re-direct the program to a temporary program file.
    echo "${AWK_PROGRAM}" > "${DATA_FILE_NAME}.awk"

    # Execute the data file formatting command.
    awk -f "${DATA_FILE_NAME}.awk" "${DATA_FILE_NAME}" > "${DATA_FILE_NAME}.${DIVISION_NAME}.tmp"

    # Get generation date.
    GENERATION_TIME=`date +"%Y%m%d_%H%M%S"`
    # Get MOC code (month) from the data file.
    MOC_CODE=`head -1 "${DATA_FILE_NAME}" | awk -F, '{ print $3 }'`

    # Prepare dump file.
    DUMP_FILE_NAME="${DIVISION_NAME}_${MOC_CODE}_${GENERATION_TIME}_${DATA_FILE_NAME}"
    cat "${HEADER_FILE_NAME}.${DIVISION_NAME}.tmp" "${DATA_FILE_NAME}.${DIVISION_NAME}.tmp" > "${DUMP_FILE_NAME}"

    # Remove temporary files.
    rm *.tmp

    # FTP dump file to remote server.
    REMOTE_SERVER="<replace with server ip address>"
    REMOTE_USER="xxxx"
    REMOTE_PWD="xxxx"
    # FTP.
    ftp -n "${REMOTE_SERVER}" << EOF
         quote USER "${REMOTE_USER}"
         quote PASS "${REMOTE_PWD}"
         ascii
         put "${DUMP_FILE_NAME}"
         bye
    EOF

    # Exit.
    exit 0
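    (For reference: the script expects data_file.csv and header_file.csv in the working directory and takes the division name as its only argument, so a call would look something like "sh dump_division.sh DIV01", where the script file name is just illustrative. The MOC_CODE read from the first data record is what puts the month into the generated file name.)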

  • Flat files data comma separated using SSIS.

    Hi,
    I have multiple flat files which come in comma separated columns. See example below :
    Customer Data
    CustID,FName,LName,Disease,Email,Phone
    12345,Xyz,Smit,Bronchitis, Asthma and fever,[email protected],80000000
    12346,Abc,Doe,fever Headache,[email protected],90000000
    12347,Klu,joe,Sugar, cough and fever,[email protected],12345678
    Please look at IDs 12345 and 12347. The Disease column has embedded commas in its values. How do I handle these embedded commas in the Disease column so that the file can be loaded from the flat file into a SQL table using SSIS?
    Please help !
    Thanks

    Here is a full solution based on my post above (first option).
    1. create temp table (Give it a unique name):
    create table #T (Txt NVARCHAR(MAX))
    GO
    2. Insert all the data into the temporary table. Each line in the text file becomes the value of the single column in one row of the table.
    -- I will jump to the table and use a simple insert.
    -- If you have a problem with step 1 then please inform us (this is basically a simple bulk insert)
    insert #T (Txt) values
    ('1234435,Xyz,Stemit,Brfsdonchitis, Asthma and fever,[email protected],80000000'),
    ('12346,Agjdfjbc,Doge,fevhhhher Headsxdshhache,[email protected],90000000'),
    ('123447,Klu,joe,Sugar, cough and fever,[email protected],12345678')
    GO
    the result should be like this:
    Txt
    1234435,Xyz,Stemit,Brfsdonchitis, Asthma and fever,[email protected],80000000
    12346,Agjdfjbc,Doge,fevhhhher Headsxdshhache,[email protected],90000000
    123447,Klu,joe,Sugar, cough and fever,[email protected],12345678
    I use a split function named Split_CLR_Fn. This is a CLR split function that takes <string to split> and <string as delimiter> as input and returns a table with 2 columns: ID, SplitData.
    For example, if you use SELECT * from Split_CLR_Fn('text1,text2,text3', ',') then you get this result:
    ID SplitData
    1 Text1
    2 Text2
    3 Text3
    ** You can find several good functions on the internet. I HIGHLY RECOMMEND NOT USING T-SQL FUNCTIONS but a CLR FUNCTION. Check this link to understand why:
    http://sqlperformance.com/2012/07/t-sql-queries/split-strings
    ** This is the best function that I know about and the one I use, but I changed the code a bit to return 2 columns and not just SplitData, as in this blog: http://sqlblog.com/blogs/adam_machanic/archive/2009/04/28/sqlclr-string-splitting-part-2-even-faster-even-more-scalable.aspx
    That's it :-) We are ready for the solution, which is very simple.
    Solution 1 (BAD solution but easy to write):
    select
    (select SplitData from Split_CLR_Fn(Txt,',') where ID = 1) CustID,
    (select SplitData from Split_CLR_Fn(Txt,',') where ID = 2) FName,
    (select SplitData from Split_CLR_Fn(Txt,',') where ID = 3) LName,
    STUFF((select ',' + SplitData from Split_CLR_Fn(Txt,',') where ID > 3 and ID < (select MAX(ID) from Split_CLR_Fn(Txt,',')) - 1 for XML path('')), 1 , 1,'') Disease,
    (select SplitData from Split_CLR_Fn(Txt,',') where ID = (select MAX(ID) from Split_CLR_Fn(Txt,',')) - 1) Email,
    (select SplitData from Split_CLR_Fn(Txt,',') where ID = (select MAX(ID) from Split_CLR_Fn(Txt,','))) Phone
    from #T
    GO
    Solution 2: better in this case since the format is constant (this is the solution I wrote about above)
    ;With MyCTE as (
    select
    Txt,
    SUBSTRING(Txt, 1, CHARINDEX(',', Txt, 1) - 1) as CustID
    , SUBSTRING(
    Txt
    ,CHARINDEX(',', Txt, 1) + 1 -- start from the end of the previous field's length
    , CHARINDEX(',', Txt, CHARINDEX(',', Txt, 1)+1)- CHARINDEX(',', Txt, 1) - 1
    ) as FName
    , SUBSTRING(
    Txt
    ,CHARINDEX(',', Txt, CHARINDEX(',', Txt, 1)+1)+1 -- start from the end of the previous field's length
    , CHARINDEX(',', Txt, CHARINDEX(',', Txt, CHARINDEX(',', Txt, 1)+1)+1) - CHARINDEX(',', Txt, CHARINDEX(',', Txt, 1)+1) - 1
    ) as LName
    , RIGHT(Txt, CHARINDEX(',', REVERSE(Txt), 1) - 1) as Phone
    , RIGHT(LEFT(Txt, Len(Txt) - Len(RIGHT(Txt, CHARINDEX(',', REVERSE(Txt), 1) - 1)) - 1), CHARINDEX(',', REVERSE(LEFT(Txt, Len(Txt) - Len(RIGHT(Txt, CHARINDEX(',', REVERSE(Txt), 1) - 1)) - 1)), 1) - 1) as Email
    from #T
    )
    select CustID,FName,LName, Phone, Email, SUBSTRING(Txt, Len(CustID) + Len(FName) + Len(LName) + 4, Len(Txt) - Len(Email) - LEN(Phone) - Len(CustID) - Len(FName) - Len(LName) - 5) as Disease
    from MyCTE
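    For the three sample rows inserted above, both approaches should return CustID, FName, LName, Email and Phone as single columns, with the complete disease text, embedded commas included (for example "Brfsdonchitis, Asthma and fever"), reassembled in the Disease column.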
    I hope that this is useful :-)
      Ronen Ariely

  • How to append lines to an Open Hub File

    Hi Colleagues,
    I have to write data to an open hub destination file.
    Normally, file overwrite mode is used.
    But I need to append lines from multiple DTPs to one single file.
    The Open Hub Destination allows adding multiple DTPs, but how can BI be configured so that the lines of every DTP are appended to the file, rather than only the last DTP winning?
    Thanks and Regards,
    Wolfgang

    No answer.

  • Can you open CSV-files with commas?

    It seems to me that we have something of a regression in Numbers 2.0 (part of iWork '09) - it (like Excel before it) no longer opens regular CSV-files.
    When I use the new Numbers to try to make a CSV-file, it'll actually use semicolons instead of commas, and any of my old CSV-files (like this one: http://d.ooh.dk/misc/postnumre.csv ) (with commas) will load all the values (and the commas) in a single column.
    I wonder if it's just me, or perhaps only a problem when using European notation (with comma as the decimal separator)?

    At last, Apple has adopted the same behavior as Bento 1.
    When the decimal separator is the period, it works with standard CSV files.
    When the decimal separator is the comma, it works with CSV files using the semicolon as item delimiter.
    To open your old CSV files, temporarily set your system to a region whose decimal separator is the period. Given that, Numbers will open them flawlessly.
    The ability to choose the separator (as is now possible in Bento 2) would have been nice.
    Yvan KOENIG (from FRANCE, Sunday, January 11, 2009 20:30:03)

  • Download internal table as text file with comma separation

    hi all
    I want a text file separated by commas. I used the CSV function module, but the result is separated by semicolons; instead I need commas.
    Kindly suggest some solution.
    Thanks
    Subha

    use this fm to convert to csv file
    CALL FUNCTION 'SAP_CONVERT_TO_TEX_FORMAT'
        EXPORTING
          I_FIELD_SEPERATOR    = ','
        TABLES
          I_TAB_SAP_DATA       = ITAB_FINAL
        CHANGING
          I_TAB_CONVERTED_DATA = ITAB_OUTPUT
        EXCEPTIONS
          CONVERSION_FAILED    = 1
          OTHERS               = 2.
      IF SY-SUBRC <> 0.
    * MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    *         WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      ENDIF.
    ITAB_OUTPUT is declared as DATA: ITAB_OUTPUT TYPE TRUXS_T_TEXT_DATA,
    and then download using gui_download
    CALL FUNCTION 'GUI_DOWNLOAD'
          EXPORTING
            FILENAME                = W_FILENAME
            FILETYPE                = 'ASC'
          TABLES
            DATA_TAB                = ITAB_OUTPUT
          EXCEPTIONS
            FILE_WRITE_ERROR        = 1
            NO_BATCH                = 2
            GUI_REFUSE_FILETRANSFER = 3
            INVALID_TYPE            = 4
            NO_AUTHORITY            = 5
            UNKNOWN_ERROR           = 6
            HEADER_NOT_ALLOWED      = 7
            SEPARATOR_NOT_ALLOWED   = 8
            FILESIZE_NOT_ALLOWED    = 9
            HEADER_TOO_LONG         = 10
            DP_ERROR_CREATE         = 11
            DP_ERROR_SEND           = 12
            DP_ERROR_WRITE          = 13
            UNKNOWN_DP_ERROR        = 14
            ACCESS_DENIED           = 15
            DP_OUT_OF_MEMORY        = 16
            DISK_FULL               = 17
            DP_TIMEOUT              = 18
            FILE_NOT_FOUND          = 19
            DATAPROVIDER_EXCEPTION  = 20
            CONTROL_FLUSH_ERROR     = 21
            OTHERS                  = 22.
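    For completeness, the declarations assumed by the two calls above might look roughly like this (the source table used for ITAB_FINAL and the file path are only illustrative):
    DATA: ITAB_FINAL  TYPE STANDARD TABLE OF t001,     " data to be converted (illustrative source table)
          ITAB_OUTPUT TYPE truxs_t_text_data,          " comma-separated text lines
          W_FILENAME  TYPE string VALUE 'C:\temp\output.csv'.   " target file (illustrative path)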

  • Uploading file with comma separator

    Hi friends,
    I have a text file in the format below.
    "abc","dedffrt","asd"
    The value of each field is enclosed in double quotes and the values are separated by commas.
    I have tried the function modules available for upload, but to no avail.
    I'm aware of the method of uploading the file and then splitting it ...
    But I want to know whether there are any other function modules available to upload a file of this type.
    Keshav

    Hi KSD,
    Try this way.
    REPORT ztest_notepad.
    DATA: BEGIN OF it_t001 OCCURS 0,
            bukrs TYPE t001-bukrs,
            butxt TYPE t001-butxt,
          END OF it_t001.
    DATA: BEGIN OF it_file OCCURS 0,
            data TYPE char255,
          END OF it_file.
    START-OF-SELECTION.
      CALL FUNCTION 'GUI_UPLOAD'
        EXPORTING
          filename = 'C:\hor_file.txt'
          filetype = 'ASC'
        TABLES
          data_tab = it_file.
      REPLACE ALL OCCURRENCES OF '"' IN TABLE it_file WITH space.
      LOOP AT it_file.
        SPLIT it_file-data AT ',' INTO it_t001-bukrs it_t001-butxt.
        APPEND it_t001.
        CLEAR it_t001.
      ENDLOOP.
      LOOP AT it_t001.
        WRITE:/ it_t001-bukrs, it_t001-butxt.
      ENDLOOP.
    Text file:
    "CGH","CGH Hospital"
    "KKH","KKH hospital"
    "SGH","SGH Hospital"
    Thanks
    Venkat.O

  • Open hub with tab separator

    Hello,
    I've an open hub to export to flat file. Is it possible to use a tab in the separator parameter?
    I'd like to have the exported data in different cells in my Excel sheet. A comma should be enough, but all the data appears in the same cell in the CSV.
    Can anybody help me?
    Thanks in advance.
    Regards.

    Hi,
    While exporting, on the Destination tab page, select the required destination and in the separator option enter ( ; ) instead of ( , ).
    Try this; I hope you will get your columns in different cells in the Excel sheet.
    Please refer to this link as well:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/501f0425-350f-2d10-bfba-a2280f288c59?QuickLink=index&overridelayout=true
    Regards,
    Sidhartha
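    One note: when Excel opens a .csv file directly, it splits columns using the "list separator" from the regional settings, which is typically ";" in locales that use the comma as decimal separator. That is likely why the semicolon works here; with a comma separator you would normally have to go through Data > Text to Columns (or the text import wizard) instead.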

  • When I import a text file (comma separated) into a Numbers spreadsheet, all the data goes into one column. Why does the text not go into separate columns based on the commas?

    When I import a text file (comma separated) into a Numbers spreadsheet, all the data goes into one column instead of individual columns based on the comma separators. Excel allows you to do this during the import. Is there a way to accomplish this in Numbers without opening it in Excel and then importing into Numbers?

    Your user info says iPad. This is the OS X Numbers forum. Assuming you are using OS X… Be sure the file is named with a .csv suffix.
    (I don't have an iPad, so I don't know the iOS answer.)

  • Issue with parsing comma separated file

    Hi,
    We are having an issue while parsing a comma-separated file with "eol" as the delimiter for the last field. For the last line in the file we get either "eof" or "eol" as the delimiter for the last field. Is there any way we can handle both scenarios using a single file adapter? Example below:
    File1:
    a,b,c,d,e,f,g eol
    a,b,c,d,e,f,g eol
    a,b,c,d,e,f,g eol
    a,b,c,d,e,f,g eol
    a,b,c,d,e,f,g eol
    File2:
    a,b,c,d,e,f,g eol
    a,b,c,d,e,f,g eol
    a,b,c,d,e,f,g eol
    a,b,c,d,e,f,g eol
    a,b,c,d,e,f,g eof
    Thanks

    Write your own tokenizer. I think StringTokenizer ignores those values if they're empty since there's nothing to tokenize it into. It's a pretty simple class, I say just write your own.

  • Comma separated Excel file FCC?

    Hi Experts,
    I have an Excel file like the one below (I have just mentioned the rows and columns to give an idea):
    first name(A1), last name(B1), street (C1) city(D1), state(E1)
    John (A2), Smith(B2), MacArthur(C2), Dallas(D2), TX(E2)
    Mike(A3), Dale(B3), Main St(C3), Austin(D3), TX(E3)
    Kevin(A4), Costner(B4), 2nd Steert(C4), Houston(D4), TX(E4)
    Can I transform this into XML out of the box using the XI 3.0 file adapter's file content conversion? If I have to use a module, is there a standard module available, or do I have to build one from scratch?
    If I open the excel file in notepad, it opens as comma separated text file
    first name,last name, street,city,state
    John,Smith,MacArthur,Dallas,TX
    Mike,Dale,Main St,Austin,TX
    Kevin,Costner,2nd Steert,Houston,TX
    How do I do FCC for the above CSV file? I mean, I have one header row with the field names and the remaining rows as record sets.

    > I have an excel file like below (I have just mentioned the row n column to give an idea):
    >
    The PI standard adapter will not support Excel files. There is no standard module available; you have to develop an adapter module to process Excel files.
    >
    > Can I tranform this into xml out-of-the box using XI 3.0 file adapter file content conversion? If I have to use the module, is there as stadard module available or should have to build from scratch?
    >  If I open the excel file in notepad, it opens as comma separated text file
    >
    If you have comma-separated values in a text file then you can use file content conversion to convert the CSV file into XML; the standard adapter supports this.
    > How do I do FCC for the above CSV file? I mean I have one header with the field names and remaings rows as record sets.
    Search SDN; many blogs are available on this topic. Converting CSV to XML is very easy.
    Regards,
    Raj
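    For a CSV layout like the one above, the sender channel's file content conversion could be set up roughly as follows (the record names are illustrative and depend on your source data type; the header row is absorbed by a separate "Header" recordset):
    Recordset Structure:    Header,1,Record,*
    Header.fieldNames:      firstName,lastName,street,city,state
    Header.fieldSeparator:  ,
    Record.fieldNames:      firstName,lastName,street,city,state
    Record.fieldSeparator:  ,
    ignoreRecordsetName:    true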

  • Open Hub: How-to doc "How to Extract data with Open Hub to a Logical File"

    Hi all,
    We are using open hub to download transaction files from InfoCubes to the application server, and would like to have a filename that is dynamic based on period and year, i.e. the period and year of the transaction data to be downloaded.
    I understand we could use a logical file for this purpose. However, we are not sure how to have the period and year dynamically derived in the filename.
    I have read a number of messages posted on SDN on a similar topic, and many have suggested a 'How-to' paper titled "How to Extract data with Open Hub to a Logical Filename". However, I could not seem to get the document from the link given.
    I just wonder if anyone has the correct or latest link to the document, or would appreciate it if you could share the document with everyone on SDN if you have a copy.
    Many thanks and best regards,
    Victoria

    Hi,
    After creating the open hub, press F1 in the 'Application server file name' text box. From the help window, click on Maintain 'Client independent file names and file paths'; you will be taken to the Implementation Guide screen.
    - Click on 'Cross-client maintenance of file names' and create a logical file path by clicking on New Entries.
    - After creating the logical file path, go to 'Logical file name definition'. There, give your logical file name, the physical file (your file name followed by month or year, whatever is applicable; press F1 for more info), the data format (ASC), the application area (BW) and the logical path (choose via F4 the one you created first).
    - Now go to 'Assignment of physical path to logical path' and give the syntax group; the physical path is the path you gave in the logical file name definition.
    We created a logical file name that identifies the file by the system date, but your requirement seems to be the dynamic date of the transaction data; you may be able to achieve this by creating a variable. The F1 help will be of much use to you. All the above steps will help you create a dynamic logical file.
    Hope this helps you to some extent.
    Regards
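    For illustration, a logical file definition along the lines described above could look like this (all names and the path are made up; <SYSID>, <DATE>, <PARAM_1> and <FILENAME> are standard placeholders offered in transaction FILE):
    Logical file path:   Z_BW_OPENHUB_PATH
    Physical path:       /usr/sap/<SYSID>/openhub/<FILENAME>
    Logical file name:   Z_BW_OPENHUB_SALES
    Physical file:       SALES_<PARAM_1>_<DATE>.CSV
    Data format:         ASC
    Application area:    BW
    Logical path:        Z_BW_OPENHUB_PATH
    <PARAM_1> would then be filled with the period/year at runtime when the logical file name is resolved (for example via function module FILE_GET_NAME).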

  • How to delete the Generated files from application server(open hub)?

    Hi experts,
    When I try to execute the DTP in a process chain, it gives the dump below. Exception CX_RSBK_REQUEST_LOCKED logged.
    When I execute the DTP manually and try to delete the previous request, it gives the dump ITAB_DUPLICATE_KEY.
    So, to delete the generated files from the application server, how can I delete them for specific dates?
    Information on where terminated
    Termination occurred in the ABAP program "GPD6S3OE0BCVGC6L9DBNVYQARZM" - in
    "START_ROUTINE".
    The main program was "RSBATCH_EXECUTE_PROZESS ".
    In the source code you have the termination point in line 2874
    of the (Include) program "GPD6S3OE0BCVGC6L9DBNVYQARZM".
    The program "GPD6S3OE0BCVGC6L9DBNVYQARZM" was started as a background job.
    When I check the dump, it points to the code below:
    " Populate the lookup table for 0STOR_LOC
    SELECT * from /BI0/TSTOR_LOC
    into CORRESPONDING FIELDS OF table L_0STOR_LOC_TEXT
    FOR ALL ENTRIES IN SOURCE_PACKAGE WHERE
    STOR_LOC = SOURCE_PACKAGE-STOR_LOC.
    but the program is syntactically correct.
    How can I rectify the issue?
    regards
    venuscm
    Edited by: venugopal vadlamudi on Sep 28, 2010 1:59 PM

    Hi experts,
    We have written a start routine to get the storage location text and send it to a file located on the application server through open hub.
    Here is the code written in the transformation.
    In the global section
    Text for 0STOR_LOC
        DATA: l_0stor_loc_text TYPE HASHED TABLE OF /bi0/tstor_loc
              WITH UNIQUE KEY stor_loc.
        DATA: l_0stor_loc_text_wa TYPE /bi0/tstor_loc.
    and in the code to get the text
    " Populate the lookup table for 0STOR_LOC
        SELECT * from /BI0/TSTOR_LOC
          into CORRESPONDING FIELDS OF table L_0STOR_LOC_TEXT
          FOR ALL ENTRIES IN SOURCE_PACKAGE WHERE
                  STOR_LOC = SOURCE_PACKAGE-STOR_LOC.
    I'm sure the problem is with the routine only. I think I need to change the code; if so, please provide me with the modified version.
    thanks
    venuscm
    Edited by: venugopal vadlamudi on Sep 29, 2010 9:37 AM
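    One common cause of ITAB_DUPLICATE_KEY at this SELECT is that the text table returns more than one row per STOR_LOC while L_0STOR_LOC_TEXT is a hashed table with a unique key on STOR_LOC. A defensive rewrite (only a sketch, not the poster's solution) selects into a standard table first and removes duplicate keys before filling the hashed lookup table:
    " Sketch: fill the hashed lookup table without risking ITAB_DUPLICATE_KEY
    DATA: lt_stor_loc_std TYPE STANDARD TABLE OF /bi0/tstor_loc.
    IF source_package IS NOT INITIAL.
      SELECT * FROM /bi0/tstor_loc
        INTO CORRESPONDING FIELDS OF TABLE lt_stor_loc_std
        FOR ALL ENTRIES IN source_package
        WHERE stor_loc = source_package-stor_loc.
      SORT lt_stor_loc_std BY stor_loc.
      DELETE ADJACENT DUPLICATES FROM lt_stor_loc_std COMPARING stor_loc.
      l_0stor_loc_text = lt_stor_loc_std.   " only unique keys remain
    ENDIF.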

  • Open hub error when generating file in application server

    Hi, everyone.
    I'm trying to execute an open hub destination that save the result as a file in the application server.
    The issue is: in the production environment we have two application servers; XYZ is the database server, and A01 is the application server. When I direct the open hub to save the file on A01, everything works fine. But when I change it to save to XYZ, I get the following error:
    >>> Exception in Substep Start Update...
    Message detail: Could not open file "path and file" on application server
    Message no. RSBO214
    When I use transaction AL11, I can see the file there in the XYZ filesystem (with date and time corresponding to the execution), but I can't view the content and the size appears to be zero.
    Possible causes I already checked: authorization, disk space, SM21 logs.
    We are in SAP BW 7.31 support package 6.
    Any idea what could be the issue or where to look better?
    Thanks and regards.
    Henrique Teodoro

    Hi, there.
    Posting the resolution for this issue.
    SAP support gave directions that solved the problem. No matter on which server (XYZ or A01) I log on or start a process chain, the DTP job always runs on the A01 server, and that causes an error since the directory doesn't exist on server XYZ.
    This occurred because the DTP setting for the background job was left blank. I followed these steps to solve the problem:
    - open DTP
    - go to "Settings for Batch Manager"
    - in "Server/Host/Group on Which Additional Processes Should Run" I picked the desired server
    - save
    After that, no matter from where I start the open hub extraction, it always runs in specified server and saves the file accordingly.
    Regards.
    Henrique Teodoro

  • Exporting Metadata (caption information) from JPEGS to a comma separated value (CSV) file

    Here is my dilemma. I am an archivist at an arts organization and we are in the process of digitizing many of our materials to post them on the web and make them available to internet users. One of the principal components of our collection is a large trove of photographs. We have been in the process of digitizing these images and embedding metadata (in the Caption/Description, Author/Photographer and Copyright fields) via Photoshop's File Info command.
    Now I am at a crossroads. We need to extract this metadata and transfer it into a comma separated value form, like an Excel spreadsheet or a FileMakerPro database. I have been told that it is not possible to do this through PhotoShop, that I must run a script through Acrobat or Bridge. I have no clue how to do this. I have been directed to a couple of links.
    First I was directed to this (now dead) link: http://www.barredrocksoftware.com/products.html
    The BSExportMetadata script allegedly exports the metadata from files selected in Adobe's Bridge into a comma separated value (CSV) file suitable for import into Excel, Access and most database programs. It installs as a Bridge menu item, making it simple to use. The Export Metadata script provides you with an easy-to-use wizard allowing you to select associated information about a set of images that you can then export. This script requires Creative Suite 2 (CS2). It sounds like it does exactly what I want to do, but unfortunately, it no longer exists.
    Then I found this:
    Arnold Dubin, "Script to Export and Import Keywords and Metadata" #13, 8 Aug 2005 7:23 am
    I tried this procedure, but nothing seemed to happen. I also tried to copy the script into the JavaScript action option in Acrobat, but I received a message that the script had an error. It also seems to me that this script does not set up a dumping point, that is, a file into which this information will be exported.
    I am a novice, not a code writer or a programmer/developer. I need a step-by-step explanation of how to implement this filtering of information. We have about 2000 jpeg and tiff files, so I would rather not go through each file and copy and paste this information elsewhere. I need to find out how to create a batch process that will do this procedure for me. Can anyone help?

    Hello -
    Is anyone aware of a tool that will do the above that is available for mac? Everything I've found so far seems to be PC only.
    Any help is appreciated, thanks!
