Wrong import date

When I import pictures into iPhoto 8, the date shows as Jan 5, 2007 on all the pictures, even the current ones. I know the camera date is accurate. Does anyone know why this is and how to correct it?

I'm searching for help for the same problem, but I had this occur already with earlier iPhoto versions, the first time around 2004. I don't know what version that was, but I always had the latest one installed. Over the years this problem keeps occurring, and I have batches of hundreds of pictures with the wrong dates. It's especially sad when those are pix of your kids: my daughter's entire first year is dated 04/30 of the year before she was born. I would love to get the real dates back!
This always occurred after big changes, like a new computer, an upgrade, or having to move the library; nothing that should cause any of this. I never went into the library file and messed around in it. These photo libraries should be able to move safely with us for the rest of our lives.

Similar Messages

  • Reinstalled and Imported Data with Migration Assistant, What went wrong?

    My MacBook HD recently went boom! After replacing it, I reinstalled the OS with the original DVD discs. A few steps later, I was offered the option to import data from an older computer, so I plugged in my FireWire portable disk with a CarbonCopy image of my old disk and proceeded with the import process.
    After the Assistant finished, only a minor import error with Missing Sync was displayed. When the OS was completely installed, I tried opening some Office apps and Dreamweaver, but none of them would open. So I tried to reinstall them, but then I discovered that after the disk image from the DMG files was mounted, I was unable to open its associated Finder window in order to copy or install the apps.
    Does anyone know why this is?
    Also, on both the FireWire disk and the MacBook fresh install, there are a few files and folders named Desktop-#. Why are there so many?

    The FireWire disk is a backup of the original HD that went bust on my MacBook. I don't know the exact version of the OS (10.4.10 maybe) but it was Tiger for sure. My MacBook is a Core Duo 13.3-inch model.
    The HD inside my MacBook now is the original Tiger version that shipped with it, but I'm in the process of downloading and installing all issued updates. I re-erased all the contents again, so now my laptop is fresh again.
    Any guiding light as to how to transfer the info on the FireWire disk back to the laptop?
    BTW, I can boot my machine from the FireWire disk and it works without problems, except for the Office apps that don't launch.

  • Wrong sort date when importing .mbox

    Hi, I have just converted my .pst files to .mbox. After importing my inbox into Entourage and sorting by the "Received" date, all emails sort only by today's import date and time (each email still shows the original sent/received date).
    How can the emails be sorted according to their correct date and NOT according to import time? Help appreciated.

    Why is knowing that important during the edit?
    Yes, FCP only shows the "Date Imported" metadata. I assume future updates of FCP X will add more metadata support, such as Creation Date and Modification Date, since metadata is vital to FCP X and there are some metadata handling features on the way.

  • Beginner trying to import data from GL_interface to GL

    Hello, I'm a beginner with Oracle GL and I'm not able to do an import from GL_INTERFACE. I have put my data into GL_INTERFACE, but I think something is wrong with it, because when I try to import the data
    with Journals > Import > Run, Oracle answers that GL_INTERFACE is empty!
    I think maybe my insert is not correct and that's why Oracle doesn't want to use it. Can someone help me?
    I have put the data in this way:
    insert into gl_interface (status, set_of_books_id, accounting_date, currency_code, date_created,
                              created_by, actual_flag, user_je_category_name, user_je_source_name,
                              segment1, segment2, segment3, entered_dr, entered_cr,
                              transaction_date, reference1)
    values ('NEW', '1609', sysdate, 'FRF', sysdate, 1008009, 'A', 'xx jab payroll', 'xx jab payroll',
            '01', '002', '4004', 111.11, 0, sysdate, 'xx_finance_jab_demo');
    insert into gl_interface (status, set_of_books_id, accounting_date, currency_code, date_created,
                              created_by, actual_flag, user_je_category_name, user_je_source_name,
                              segment1, segment2, segment3, entered_dr, entered_cr,
                              transaction_date, reference1)
    values ('NEW', '1609', sysdate, 'FRF', sysdate, 1008009, 'A', 'xx jab payroll', 'xx jab payroll',
            '01', '002', '1005', 0, 111.11, sysdate, 'xx_finance_jab_demo');
    Oracle sends me this message:
    General Ledger: Version : 11.5.0 - Development
    Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
    GLLEZL module: Journal Import
    Current system time is 14-MAR-2007 15:39:25
    Running in Debug Mode
    gllsob() 14-MAR-2007 15:39:25sob_id = 124
    sob_name = Vision France
    coa_id = 50569
    num_segments = 6
    delim = '.'
    segments =
    SEGMENT1
    SEGMENT2
    SEGMENT3
    SEGMENT4
    SEGMENT5
    SEGMENT6
    index segment is SEGMENT2
    balancing segment is SEGMENT1
    currency = EUR
    sus_flag = Y
    ic_flag = Y
    latest_opened_encumbrance_year = 2006
    pd_type = Month
    << gllsob() 14-MAR-2007 15:39:25
    gllsys() 14-MAR-2007 15:39:25fnd_user_id = 1008009
    fnd_user_name = JAB-DEVELOPPEUR
    fnd_login_id = 2675718
    con_request_id = 2918896
    sus_on = 0
    from_date =
    to_date =
    create_summary = 0
    archive = 0
    num_rec = 1000
    num_flex = 2500
    run_id = 55578
    << gllsys() 14-MAR-2007 15:39:25
    SHRD0108: Retrieved 51 records from fnd_currencies
    gllcsa() 14-MAR-2007 15:39:25<< gllcsa() 14-MAR-2007 15:39:25
    gllcnt() 14-MAR-2007 15:39:25SHRD0118: Updated 1 record(s) in table: gl_interface_control
    source name = xx jab payroll
    group id = -1
    LEZL0001: Found 1 sources to process.
    glluch() 14-MAR-2007 15:39:25<< glluch() 14-MAR-2007 15:39:25
    gl_import_hook_pkg.pre_module_hook() 14-MAR-2007 15:39:26<< gl_import_hook_pkg.pre_module_hook() 14-MAR-2007 15:39:26
    glusbe() 14-MAR-2007 15:39:26<< glusbe() 14-MAR-2007 15:39:26
    << gllcnt() 14-MAR-2007 15:39:26
    gllpst() 14-MAR-2007 15:39:26SHRD0108: Retrieved 110 records from gl_period_statuses
    << gllpst() 14-MAR-2007 15:39:27
    glldat() 14-MAR-2007 15:39:27Successfully built decode fragment for period_name and period_year
    gllbud() 14-MAR-2007 15:39:27SHRD0108: Retrieved 10 records from the budget tables
    << gllbud() 14-MAR-2007 15:39:27
    gllenc() 14-MAR-2007 15:39:27SHRD0108: Retrieved 15 records from gl_encumbrance_types
    << gllenc() 14-MAR-2007 15:39:27
    glldlc() 14-MAR-2007 15:39:27<< glldlc() 14-MAR-2007 15:39:27
    gllcvr() 14-MAR-2007 15:39:27SHRD0108: Retrieved 6 records from gl_daily_conversion_types
    << gllcvr() 14-MAR-2007 15:39:27
    gllfss() 14-MAR-2007 15:39:27LEZL0005: Successfully finished building dynamic SQL statement.
    << gllfss() 14-MAR-2007 15:39:27
    gllcje() 14-MAR-2007 15:39:27main_stmt:
    select int.rowid
    decode(int.SEGMENT1|| int.SEGMENT2|| int.SEGMENT3|| int.SEGMENT4|| int.SEGMENT5|| int.SEGMENT6
    , '', replace(ccid_cc.SEGMENT2,'.','
    ') || '.' || replace(ccid_cc.SEGMENT1,'.','
    ') || '.' || replace(ccid_cc.SEGMENT3,'.','
    ') || '.' || replace(ccid_cc.SEGMENT4,'.','
    ') || '.' || replace(ccid_cc.SEGMENT5,'.','
    ') || '.' || replace(ccid_cc.SEGMENT6,'.','
    , replace(int.SEGMENT2,'.','
    ') || '.' || replace(int.SEGMENT1,'.','
    ') || '.' || replace(int.SEGMENT3,'.','
    ') || '.' || replace(int.SEGMENT4,'.','
    ') || '.' || replace(int.SEGMENT5,'.','
    ') || '.' || replace(int.SEGMENT6,'.','
    ') ) flexfield , nvl(flex_cc.code_combination_id,
    nvl(int.code_combination_id, -4))
    , decode(int.SEGMENT1|| int.SEGMENT2|| int.SEGMENT3|| int.SEGMENT4|| int.SEGMENT5|| int.SEGMENT6
    , '', decode(ccid_cc.code_combination_id,
    null, decode(int.code_combination_id, null, -4, -5),
    decode(sign(nvl(ccid_cc.start_date_active, int.accounting_date-1)
    - int.accounting_date),
    1, -1,
    decode(sign(nvl(ccid_cc.end_date_active, int.accounting_date +1)
    - int.accounting_date),
    -1, -1, 0)) +
    decode(ccid_cc.enabled_flag,
    'N', -10, 0) +
    decode(ccid_cc.summary_flag, 'Y', -100,
    decode(int.actual_flag,
    'B', decode(ccid_cc.detail_budgeting_allowed_flag,
    'N', -100, 0),
    decode(ccid_cc.detail_posting_allowed_flag,
    'N', -100, 0)))),
    decode(flex_cc.code_combination_id,
    null, -4,
    decode(sign(nvl(flex_cc.start_date_active, int.accounting_date-1)
    - int.accounting_date),
    1, -1,
    decode(sign(nvl(flex_cc.end_date_active, int.accounting_date +1)
    - int.accounting_date),
    -1, -1, 0)) +
    decode(flex_cc.enabled_flag,
    'N', -10, 0) +
    decode(flex_cc.summary_flag, 'Y', -100,
    decode(int.actual_flag,
    'B', decode(flex_cc.detail_budgeting_allowed_flag,
    'N', -100, 0),
    decode(flex_cc.detail_posting_allowed_flag,
    'N', -100, 0)))))
    , int.user_je_category_name
    , int.user_je_category_name
    , 'UNKNOWN' period_name
    , decode(actual_flag, 'B'
         , decode(period_name, NULL, '-1' ,period_name), nvl(period_name, '0')) period_name2
    , currency_code
    , decode(actual_flag
         , 'A', actual_flag
         , 'B', decode(budget_version_id
         , 1210, actual_flag
         , 1211, actual_flag
         , 1212, actual_flag
         , 1331, actual_flag
         , 1657, actual_flag
         , 1658, actual_flag
         , NULL, '1', '6')
         , 'E', decode(encumbrance_type_id
         , 1000, actual_flag
         , 1001, actual_flag
         , 1022, actual_flag
         , 1023, actual_flag
         , 1024, actual_flag
         , 1048, actual_flag
         , 1049, actual_flag
         , 1050, actual_flag
         , 1025, actual_flag
         , 999, actual_flag
         , 1045, actual_flag
         , 1046, actual_flag
         , 1047, actual_flag
         , 1068, actual_flag
         , 1088, actual_flag
         , NULL, '3', '4'), '5') actual_flag
    , '0' exception_rate
    , decode(currency_code
         , 'EUR', 1
         , 'STAT', 1
         , decode(actual_flag, 'E', -8, 'B', 1
         , decode(user_currency_conversion_type
         , 'User', decode(currency_conversion_rate, NULL, -1, currency_conversion_rate)
         ,'Corporate',decode(currency_conversion_date,NULL,-2,-6)
         ,'Spot',decode(currency_conversion_date,NULL,-2,-6)
         ,'Reporting',decode(currency_conversion_date,NULL,-2,-6)
         ,'HRUK',decode(currency_conversion_date,NULL,-2,-6)
         ,'DALY',decode(currency_conversion_date,NULL,-2,-6)
         ,'HLI',decode(currency_conversion_date,NULL,-2,-6)
         , NULL, decode(currency_conversion_rate,NULL,
         decode(decode(nvl(to_char(entered_dr),'X'),'X',1,2),decode(nvl(to_char(accounted_dr),'X'),'X',1,2),
         decode(decode(nvl(to_char(entered_cr),'X'),'X',1,2),decode(nvl(to_char(accounted_cr),'X'),'X',1,2),-20,-3),-3),-9),-9))) currency_conversion_rate
    , to_number(to_char(nvl(int.currency_conversion_date, int.accounting_date), 'J'))
    , decode(int.actual_flag
         , 'A', decode(int.currency_code
              , 'EUR', 'User'
         , 'STAT', 'User'
              , nvl(int.user_currency_conversion_type, 'User'))
         , 'B', 'User', 'E', 'User'
         , nvl(int.user_currency_conversion_type, 'User')) user_currency_conversion_type
    , ltrim(rtrim(substrb(rtrim(substrb(int.reference1, 1, 50)) || ' ' || int.user_je_source_name || ' 2918896: ' || int.actual_flag || ' ' || int.group_id, 1, 100)))
    , rtrim(substrb(nvl(rtrim(int.reference2), 'Journal Import ' || int.user_je_source_name || ' 2918896:'), 1, 240))
    , ltrim(rtrim(substrb(rtrim(rtrim(substrb(int.reference4, 1, 25)) || ' ' || int.user_je_category_name || ' ' || int.currency_code || decode(int.actual_flag, 'E', ' ' || int.encumbrance_type_id, 'B', ' ' || int.budget_version_id, '') || ' ' || int.user_currency_conversion_type || ' ' || decode(int.user_currency_conversion_type, NULL, '', 'User', to_char(int.currency_conversion_rate), to_char(int.currency_conversion_date))) || ' ' || substrb(int.reference8, 1, 15) || int.originating_bal_seg_value, 1, 100)))
    , rtrim(nvl(rtrim(int.reference5), 'Journal Import 2918896:'))
    , rtrim(substrb(nvl(rtrim(int.reference6), 'Journal Import Created'), 1, 80))
    , rtrim(decode(upper(substrb(nvl(rtrim(int.reference7), 'N'), 1, 1)),'Y','Y', 'N'))
    , decode(upper(substrb(int.reference7, 1, 1)), 'Y', decode(rtrim(reference8), NULL, '-1', rtrim(substrb(reference8, 1, 15))), NULL)
    , rtrim(upper(substrb(int.reference9, 1, 1)))
    , rtrim(nvl(rtrim(int.reference10), nvl(to_char(int.subledger_doc_sequence_value), 'Journal Import Created')))
    , int.entered_dr
    , int.entered_cr
    , to_number(to_char(int.accounting_date,'J'))
    , to_char(int.accounting_date, 'YYYY/MM/DD')
    , int.user_je_source_name
    , nvl(int.encumbrance_type_id, -1)
    , nvl(int.budget_version_id, -1)
    , NULL
    , int.stat_amount
    , decode(int.actual_flag
    , 'E', decode(int.currency_code, 'STAT', '1', '0'), '0')
    , decode(int.actual_flag
    , 'A', decode(int.budget_version_id
    , NULL, decode(int.encumbrance_type_id, NULL, '0', '1')
    , decode(int.encumbrance_type_id, NULL, '2', '3'))
    , 'B', decode(int.encumbrance_type_id
    , NULL, '0', '4')
    , 'E', decode(int.budget_version_id
    , NULL, '0', '5'), '0')
    , int.accounted_dr
    , int.accounted_cr
    , nvl(int.group_id, -1)
    , nvl(int.average_journal_flag, 'N')
    , int.originating_bal_seg_value
    from GL_INTERFACE int,
    gl_code_combinations flex_cc,
    gl_code_combinations ccid_cc
    where int.set_of_books_id = 124
    and int.status != 'PROCESSED'
    and (int.user_je_source_name,nvl(int.group_id,-1)) in (('xx jab payroll', -1))
    and flex_cc.SEGMENT1(+) = int.SEGMENT1
    and flex_cc.SEGMENT2(+) = int.SEGMENT2
    and flex_cc.SEGMENT3(+) = int.SEGMENT3
    and flex_cc.SEGMENT4(+) = int.SEGMENT4
    and flex_cc.SEGMENT5(+) = int.SEGMENT5
    and flex_cc.SEGMENT6(+) = int.SEGMENT6
    and flex_cc.chart_of_accounts_id(+) = 50569
    and flex_cc.template_id(+) is NULL
    and ccid_cc.code_combination_id(+) = int.code_combination_id
    and ccid_cc.chart_of_accounts_id(+) = 50569
    and ccid_cc.template_id(+) is NULL
    order by decode(int.SEGMENT1|| int.SEGMENT2|| int.SEGMENT3|| int.SEGMENT4|| int.SEGMENT5|| int.SEGMENT6
    , rpad(ccid_cc.SEGMENT2,30) || '.' || rpad(ccid_cc.SEGMENT1,30) || '.' || rpad(ccid_cc.SEGMENT3,30) || '.' || rpad(ccid_cc.SEGMENT4,30) || '.' || rpad(ccid_cc.SEGMENT5,30) || '.' || rpad(ccid_cc.SEGMENT6,30)
    , rpad(int.SEGMENT2,30) || '.' || rpad(int.SEGMENT1,30) || '.' || rpad(int.SEGMENT3,30) || '.' || rpad(int.SEGMENT4,30) || '.' || rpad(int.SEGMENT5,30) || '.' || rpad(int.SEGMENT6,30)
    ) , int.entered_dr, int.accounted_dr, int.entered_cr, int.accounted_cr, int.accounting_date
    control->len_mainsql = 16402
    length of main_stmt = 7428
    upd_stmt.arr:
    update GL_INTERFACE
    set status = :status
    , status_description = :description
    , je_batch_id = :batch_id
    , je_header_id = :header_id
    , je_line_num = :line_num
    , code_combination_id = decode(:ccid, '-1', code_combination_id, :ccid)
    , accounted_dr = :acc_dr
    , accounted_cr = :acc_cr
    , descr_flex_error_message = :descr_description
    , request_id = to_number(:req_id)
    where rowid = :row_id
    upd_stmt.len: 394
    ins_stmt.arr:
    insert into gl_je_lines
    ( je_header_id, je_line_num, last_update_date, creation_date, last_updated_by, created_by , set_of_books_id, code_combination_id ,period_name, effective_date , status , entered_dr , entered_cr , accounted_dr , accounted_cr , reference_1 , reference_2
    , reference_3 , reference_4 , reference_5 , reference_6 , reference_7 , reference_8 , reference_9 , reference_10 , description
    , stat_amount , attribute1 , attribute2 , attribute3 , attribute4 , attribute5 , attribute6 ,attribute7 , attribute8
    , attribute9 , attribute10 , attribute11 , attribute12 , attribute13 , attribute14, attribute15, attribute16, attribute17
    , attribute18 , attribute19 , attribute20 , context , context2 , context3 , invoice_amount , invoice_date , invoice_identifier
    , tax_code , no1 , ussgl_transaction_code , gl_sl_link_id , gl_sl_link_table , subledger_doc_sequence_id , subledger_doc_sequence_value
    , jgzz_recon_ref , ignore_rate_flag)
    SELECT
    :je_header_id , :je_line_num , sysdate , sysdate , 1008009 , 1008009 , 124 , :ccid , :period_name
    , decode(substr(:account_date, 1, 1), '-', trunc(sysdate), to_date(:account_date, 'YYYY/MM/DD'))
    , 'U' , :entered_dr , :entered_cr , :accounted_dr , :accounted_cr
    , reference21, reference22, reference23, reference24, reference25, reference26, reference27, reference28, reference29
    , reference30, :description, :stat_amt, '' , '', '', '', '', '', '', '', '' , '', '', '', '', '', '', '', '', '', '', ''
    , '', '', '', '', '', '', '', '', '', gl_sl_link_id
    , gl_sl_link_table
    , subledger_doc_sequence_id
    , subledger_doc_sequence_value
    , jgzz_recon_ref
    , null
    FROM GL_INTERFACE
    where rowid = :row_id
    ins_stmt.len: 1818
    glluch() 14-MAR-2007 15:39:27<< glluch() 14-MAR-2007 15:39:27
    LEZL0008: Found no interface records to process.
    LEZL0009: Check SET_OF_BOOKS_ID, GROUP_ID, and USER_JE_SOURCE_NAME of interface records.
    If no GROUP_ID is specified, then only data with no GROUP_ID will be retrieved. Note that most data
    from the Oracle subledgers has a GROUP_ID, and will not be retrieved if no GROUP_ID is specified.
    SHRD0119: Deleted 1 record(s) from gl_interface_control.
    Start of log messages from FND_FILE
    End of log messages from FND_FILE
    Executing request completion options...
    Finished executing request completion options.
    No data was found in the GL_INTERFACE table.
    Concurrent request completed
    Current system time is 14-MAR-2007 15:39:27
    ---------------------------------------------------------------------------

    As the error message says, you need to specify a group ID.
    From the documentation:
    GROUP_ID: Enter a unique group number to distinguish import data within a
    source. You can run Journal Import in parallel for the same source if you specify a
    unique group number for each request.
    For example, if you load data for both Payables and Receivables, you need to use different group IDs to separate the Payables and Receivables data.
    HTH
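    A minimal sketch of the same insert with an explicit group ID (the group number 100 here is arbitrary and only for illustration; note also that the log's WHERE clause filters on set_of_books_id = 124, while the original inserts used '1609'):
    insert into gl_interface (status, set_of_books_id, accounting_date, currency_code, date_created,
                              created_by, actual_flag, user_je_category_name, user_je_source_name,
                              segment1, segment2, segment3, entered_dr, entered_cr,
                              transaction_date, reference1, group_id)
    values ('NEW', 124, sysdate, 'FRF', sysdate, 1008009, 'A', 'xx jab payroll', 'xx jab payroll',
            '01', '002', '4004', 111.11, 0, sysdate, 'xx_finance_jab_demo', 100);
    -- Quick sanity check of what is actually sitting in the interface table
    -- before rerunning Journal Import:
    select set_of_books_id, user_je_source_name, group_id, status, count(*)
    from gl_interface
    group by set_of_books_id, user_je_source_name, group_id, status;
    When submitting Journal Import, pick the source 'xx jab payroll' and the same group ID so these rows are retrieved.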

  • BUG 3.0/3.1 Import data from CSV in non-English locale

    Hello,
    I have 3.0 on a German XP (NLS decimal "," and group ".") but with
    AddVMOption -Duser.language=en
    because of bug 9231534.
    I try to import data from a delimited file that contains the number "123,23".
    When I use the Import Data wizard, it generates an insert with "12323.0". This import works, but gives me the wrong value: 12323 instead of 123,23.
    With 3.1 and the German UI I try the same import.
    It generates a correct insert, "123.23", but executing it fails because it expects the German decimal separator. This is wrong for two reasons:
    The generated script has a dot as the decimal separator, and it would not make sense to use a comma, because that is the value separator in an insert script.
    To execute the script, it should use the same NLS settings as were used to generate it.
    Regards
    Marcus

    Hello,
    has anybody found a solution?
    Testcase: SQL Developer 3.1 on a German XP with default NLS settings.
    CREATE TABLE "TEST_TABLE" (
           "NUM" NUMBER
          ,"VCH" VARCHAR2(10 BYTE)
        );
    Test file test_insert.dsv:
    num;vch
    1;KL
    1,5;tz
    12345,45;oo
    Importing using the wizard inserts the first row correctly; for the others I get
    SET DEFINE OFF
    --Einfügen für Zeilen 1 bis 3 nicht erfolgreich (insert for rows 1 to 3 not successful)
    --ORA-01722: invalid number
    --Zeile 2 (row 2)
    INSERT INTO TEST_TABLE (NUM, VCH) VALUES (1.5,'tz');
    --Zeile 3 (row 3)
    INSERT INTO TEST_TABLE (NUM, VCH) VALUES (12345.45,'oo');
    Besides the wrong umlaut in the message, the insert statement itself is correct, because you cannot use the German decimal separator "," in the script. The bug is that it should use the same NLS settings for generating and running the script.
    Regards
    Marcus
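    A workaround sketch, assuming the ORA-01722 comes from NLS-dependent number conversion when the script is executed: make the conversion explicit, so each value carries its own NLS setting independent of the session.
    INSERT INTO TEST_TABLE (NUM, VCH) VALUES
      (TO_NUMBER('1.5', '9999999D99', 'NLS_NUMERIC_CHARACTERS=''.,'''), 'tz');
    INSERT INTO TEST_TABLE (NUM, VCH) VALUES
      (TO_NUMBER('12345.45', '9999999D99', 'NLS_NUMERIC_CHARACTERS=''.,'''), 'oo');
    If the conversion happens server-side, running ALTER SESSION SET NLS_NUMERIC_CHARACTERS = '.,'; before executing the generated script should have the same effect.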

  • Error in import data

    In the Oracle Enterprise Manager console (logged in to the Oracle Management Server), when I use the wizard to import data, there is an error reading the import file (I use a CSV file).
    The error message is:
    VNI-2015: The Node preferred credentials for the target node are either invalid or do not have sufficient privileges to complete the operation.
    On Windows platforms, the Node credentials specified for the Windows target should have the "Logon as a batch job" privilege.
    I don't know where to set "Logon as a batch job"; I only see the "Preferred Credentials" tab under Configuration > Preferences.
    Can anybody help me? Thanks in advance.

    Hi
    What you can do is check that the OS user submitting the job through the OEM console is a local user on the database server and not a domain user. If the user is a domain user, the job submission will fail with a VNI-2015 authentication failure. I have encountered this situation many times for the same reason: the credentials for the node user were wrong.
    When adding a local user on the database server, it has to be nodename\username and not domain_name\username. Assign the "Log on as batch job" right and Administrator rights to the local user. This should fix the problem. You might also check the file permissions on the backup directories.
    Hope this helps
    Thank You

  • LSMW While importing data

    Hello, I'm using LSMW to transfer data, but I get an error at step 9, Import Data.
    I can't read the error message text because it's not readable (it shows in the wrong encoding).
    The error doesn't occur every time: I use the same file but run it at different times, and I don't know the reason for the error.
    I know this may not be enough info to solve my problem, but if someone has been confronted with this, please notify me.
    Thanks.
    Edited by: kernel.panic on Sep 28, 2009 12:07 PM
    link to the error: http://s53.radikal.ru/i140/0909/e5/5a937bff1451.jpg

    It doesn't matter which codepage I use.
    For example, when I specify the input file on drive H: although my file is on drive C:, I get the same error.
    I've launched the debugger and found the line with the error:
    FORM write_header_record.
        CLEAR gs_lsheader.
        gs_lsheader-lsmw     = 'LSMW'.
        gs_lsheader-project  = g_project.
        gs_lsheader-subproj  = g_subproj.
        gs_lsheader-object   = g_object.
        gs_lsheader-systemid = sy-sysid.
        gs_lsheader-client   = sy-mandt.
        gs_lsheader-datum    = sy-datum.
        gs_lsheader-uzeit    = sy-uzeit.
        gs_lsheader-uname    = sy-uname.
    * Begin: binary write
        g_record_length     = STRLEN( gs_lsheader ).
        g_hex_record_length = g_record_length.
        TRANSFER g_hex_record_length TO g_filename.  " <-- the error occurs at this line
        ASSIGN g_max_buffer(g_record_length) TO <g_buffer>.
        <g_buffer> = gs_lsheader.
        TRANSFER <g_buffer> TO g_filename.
    * End: binary write

  • Import data from text file

    We are trying to import data from a text file into fields in a form using the importTextData method for the Doc object. The script we are using looks like this:
    var doc = event.target;
    var returnCode = doc.importTextData("datafile.txt",0);
    The text file "datafile.txt" contains tab-separated values in a header row (which corresponds to field names in the form) followed by tab-separated data rows, all according to the instructions in the API.
    The returnCode results in an integer value, which in our case represents "Error: Invalid Row". It would help us a lot if anyone could give us a hint on what may be wrong, or even better, post a working solution.
    We have also tried using the importTextData method without any parameters (the user is then prompted to select the text file and then the specific row), but the result remains the same: the fields in the form are not populated with data and the message "Error: Invalid Row" is returned.
    Annika Lindqvist

    The problem you're experiencing is that importTextData is an Acrobat API call. As such, it executes on the AcroForm field equivalents of the XFA fields you've placed on the form. What's not indicated in the API documentation for importTextData is that the field names in the header of the text file must be the
    full AcroForm SOM expressions.
    I've created a working sample to explain this.
    First, I created a form with a table with one row in it containing 3 columns named FirstName, LastName and Country.
    Then I created a data.txt file like this:
    FirstName LastName Country
    Fred Jones USA
    Kyle Francis Canada
    Sam Roberts UK
    I then placed a button on the form which calls "event.target.importTextData();" and ran the form in Preview.
    Of course, when I selected a row for import, nothing happened.
    I then used the "Advanced | Forms | Export Data from Form..." menu option in Acrobat after filling in the fields in the table with some text, and looked at the generated text file (note that you have to specify text as the output format). In there, I was able to figure out what the full AcroForm SOM expression was for each field I wanted to import data into.
    I changed my data file to look like this:
    form1[0].#subform[0].Table1[0].Row1[0].FirstName[0] form1[0].#subform[0].Table1[0].Row1[0].LastName[0] form1[0].#subform[0].Table1[0].Row1[0].Country[0]
    Fred Jones USA
    Kyle Francis Canada
    Sam Roberts UK
    After that, importTextData worked as expected without any errors.
    Stefan
    Adobe Systems

  • Import Data Tier Application Collation 1033 is not supported

    When I try to use Import Data-Tier Application I get an error: Could not load schema model from package (Microsoft.SqlServer.Dac)
    Additional information: Collation 1033 is not supported. You must specify one of the supported collations in the Collation attribute.
    Everything was working fine for a while and then I started getting this error.
    I checked my System Restore history and there's a message "Language Pack Removal".
    I also believe this started happening after I installed Office 2013.
    Thanks for your help,
    Keith

    Looking over your answer again, I don't see how anything could be wrong with the file.
    1) I am able to import the file with no problem.
    2) I update Windows.
    3) I am no longer able to import that file or any other .bacpac files, and I get the error:
    Could not load schema model from package (Microsoft.SqlServer.Dac)
    Additional information: Collation 1033 is not supported. You must specify one of the supported collations in the Collation attribute.
    I have checked that SQL Server has all of the correct collation settings as well, so I have no idea where to even start looking for this issue.
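    A diagnostic sketch (SQL Server syntax; these queries only show which collations the instance and databases actually use and which collation names the instance recognizes; they are not a fix for the package itself):
    SELECT SERVERPROPERTY('Collation') AS server_collation;
    SELECT name, collation_name FROM sys.databases;
    -- List every collation name this instance supports. 1033 is a Windows
    -- locale ID (English - United States), not a collation name, which fits
    -- the "Language Pack Removal" note above.
    SELECT name FROM sys.fn_helpcollations();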

  • Batch change of import date of pictures?

    Hola experts, I'm fairly new to Aperture 3.1.1. After weeks of migrating 170 GB of pix from iPhoto '09 (8.1.2), due to iPhoto 9.1.1 stuffing up my digital pix life, some of the events have their dates wrong, set to the date they were imported into Aperture.
    Is there an "easy" way of batch processing the affected project pix back to the original dates when they were created?
    Thx Francois, and no thanks to Apple/AAPL

    Have a look at *Metadata->Adjust Date and Time...*

  • Import data from r3

    Hi,
    1) I want to import customer data from the R3 system.
    Note: I am using a repository created from the sp3_customer.a2a archive file in MDM.
    I created an IDoc-to-file scenario in XI and imported the XML file in Import Manager, but my problem is that I am not able to map all the fields one to one. So I imported
    00_DEBMDM05_R3.map (through File > Import).
    I thought that by doing this the mapping would happen automatically between the XML and MDM data, but it does not map automatically. What went wrong?
    I heard there is some other solution to import data from R3 (adding ports, client systems etc. in MDM).
    Thanks
    Rama

    Hi,
    Refer to MDM and Client Systems: http://help.sap.com/saphelp_mdmgds55/helpdata/EN/43/4e705aaee91bcce10000000a1553f7/content.htm
    Hope that helps.
    Regards,
    Tanveer.
    Please mark helpful answers

  • Error in importing data from multiple ASCII files, Concatenate

    I am trying to use the "Importing Data From Multiple ASCII Files.VBS" and "Concatenate Groups.VBS" scripts downloaded from here: http://zone.ni.com/devzone/cda/epd/p/id/3870. When I run the importing-data script and test it by highlighting the example data that comes with the download, I get an error: "Error in <Importing Data From Multiple ASCII Files.VBS> (Line: 60, Column: 11): Variable is undefined: 'AsciiAssocSet'"
    The offending line of code is: Call AsciiAssocSet(FilePaths(i), StpFilePath) ' assign STP file
    Screenshots of the error are attached.
    Can anyone tell me what this error is? Am I just doing something wrong? Getting quite frustrated.
    Thanks in advance.
    Attachments:
    concactenate1.png 261 KB
    concactenate2.png 243 KB

    Please have a look at the DIAdem example for concatenating channels. It's delivered with DIAdem:
    Examples > Creating Scripts > Scripts > Appending Channels to Each Other
    Please use a DataPlugin (File -> Text DataPlugin Wizard...) or the CSV plugin to load your data.
    Greetings
    Andreas
    P.S.: The script you downloaded needs the old ASCII Wizard, which is not activated by default in newer DIAdem versions.
    If you want to use it anyway:
    Settings -> Options -> Extensions -> GPI Extensions ...
    Add gfsasci.dll

  • iPhoto wrong event date

    Right.
    So, I've recently added all my photos into iPhoto. Great.
    But the problem is, all my old photos were taken on a camera that didn't set a 'Date picture taken' attribute but does have a 'Date modified' attribute (close enough anyway). When I add these into iPhoto, they all get the wrong event dates. Surely if iPhoto couldn't find a 'Date picture taken', it would go with 'Date modified', like Picasa 3 does.
    Any help?
    Cheers,
    Tom

    For future reference, there's an application, PhotoInfo, that can batch-change the EXIF date in photos, either by adding or subtracting a given time differential or by changing the entire date and time. It is best used on a batch of photos outside of iPhoto, before importing. It's great for fixing the time when you travel and take photos in a different time zone but didn't reset the camera's clock.
    TIP: For insurance against the iPhoto database corruption that many users have experienced, I recommend making a backup copy of the Library6.iPhoto database file (iPhoto.Library for iPhoto 5 and earlier) and keeping it current. If problems crop up where iPhoto suddenly can't see any photos or thinks there are no photos in the library, replacing the working Library6.iPhoto file with the backup will often get the library back. By keeping it current I mean back up after each import and/or any serious editing or work on books, slideshows, calendars, cards, etc. That ensures that if a problem pops up and you do need to replace the database file, you'll retain all those efforts. It doesn't take long to make the backup and it's good insurance.
    I've created an Automator workflow application (requires Tiger or later), iPhoto dB File Backup, that will copy the selected Library6.iPhoto file from your iPhoto Library folder to the Pictures folder, replacing any previous version of it. It's compatible with iPhoto 6 and 7 libraries and Tiger and Leopard. Just put the application in the Dock and click on it whenever you want to backup the dB file. iPhoto does not have to be closed to run the application, just idle. You can download it at Toad's Cellar. Be sure to read the Read Me pdf file.
    Note: There's now an Automator backup application for iPhoto 5 that will work with Tiger or Leopard.

  • Importing data from Cluster

    Hi,
    I am importing data from cluster PCL2, table RT, for Indonesia (country code 34, RELID IS).
    The amount I am getting from the RT table is wrong. For example, 40,00,000 is coming through as 40,000.
    Please help.
    Thanks in advance

    solved

  • Essbase data error importing data back to cube.

    Hi
    We have some text measures in the cube. We loaded data using Smart View. Once the data was loaded, we exported the data from the cube, and when I import it back into the same cube I get the following error message:
    Parallel data load enabled: [1] block prepare threads, [1] block write threads.
    Member [Sample text Measure] is a text measure. Only text values can be loaded to the Smart List in a text measure. [1004] Records Completed
    Unexpected Essbase error 1003073.
    I have done some analysis on this error message but could not find any solution.
    Please suggest where it is going wrong.
    Thanks in advance
    Kris
    Edited by: user9363364 on Apr 19, 2010 5:19 PM
    Edited by: user9363364 on Apr 20, 2010 3:38 PM
    Edited by: user9363364 on Apr 21, 2010 1:55 PM

    Hi,
    I don't know if I'm directing you correctly, as I have only loaded data and never imported exported text data back. However, in the 11.1.1.3 DBAG, the following appears on page 193:
    *"100-10" "New York" "Cust Index" #Txt:"Highly Satisfied"*
    This suggests that we may have to prefix the double-quoted text value with *#Txt:*.
    Want to give it a try?
