Access 2010 InfoPath Data Collection Export Fails Due to Date Format That Includes Time Zone

I created an Access 2010 database that has multiple data collection (InfoPath) forms that were generated from Access and have been in use for about 1.5 years.  Starting in 2013 (for about a week now), the submitted data fails to Export due
to a "data type conversion error" with the date fields.  Prior to 2013, the date string returned in the InfoPath form looked like this: "2013-01-07T00:00:00", but now it looks like this: "2013-01-07T00:00:00-05:00".  The time zone is appended
to the string and it kills the Outlook Export feature.
To test this, I created a new database with one table and one date time field.  I generated an InfoPath template and emailed it to myself.  I entered the date into the template and submitted it (tried manually entering the date as well as
using the date time picker control - made no difference).  The InfoPath template now contains "2013-01-07T00:00:00-05:00" and will not Export from Outlook to Access.  I tried manually pasting the string into the Access table and it would take it,
but would show "1/7/2013 5:00:00 AM" in the date time field (which isn't correct either but at least it took).  Note: This problem has appeared at my office (Win 7 with Office 2010), but my testing was done on my personal laptop using Win 8 with Office
2010.
It looks like Microsoft has created a bug and now all of my data collection forms are unusable.  Any help will be appreciated.

Microsoft confirmed that the issue was introduced with MS12-066 as follows:
***Start Quote
We have been able to identify that this issue was introduced with the change made for the hotfix detailed in KB 2687395. This update was included in the security update MS12-066 when you installed KB 2687417.
2687395          Description of the InfoPath 2010 hotfix package (ipeditor-x-none.msp): August 28, 2012
http://support.microsoft.com/kb/2687395/EN-US
2687417           MS12-066: Description of the security update for InfoPath 2010 Service Pack 1: October 9, 2012
http://support.microsoft.com/kb/2687417/EN-US
Investigating workarounds I've only come up with using HTML forms or changing the datatype of the control to text.
***End Quote
My own testing also indicates that if you are using InfoPath with SQL Server, you may be able to change the Date/Time picker control in InfoPath to a Date only picker control (if the SQL Server data
type will support that).
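Another stopgap until Microsoft fixes the regression is to normalize the submitted date strings before the Export step, i.e. strip the trailing offset so the value matches the pre-2013 format that the Export feature accepts. A minimal sketch of that idea in Python (the helper name is made up; this is not part of Access or InfoPath, just a pre-processing pass over the form XML):

from datetime import datetime

def strip_offset(value):
    # "2013-01-07T00:00:00-05:00" -> "2013-01-07T00:00:00"
    # Keeps the wall-clock time the user entered instead of letting
    # Access shift it to UTC (the "1/7/2013 5:00:00 AM" symptom).
    return datetime.fromisoformat(value).replace(tzinfo=None).isoformat()

print(strip_offset("2013-01-07T00:00:00-05:00"))  # 2013-01-07T00:00:00
print(strip_offset("2013-01-07T00:00:00"))        # already offset-free; unchanged

(Requires Python 3.7+ for datetime.fromisoformat; an equivalent find-and-replace on the date elements in the form XML achieves the same thing.)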

Similar Messages

  • OMF export failed due to internal error

    I get this error message whatever I do when I try to export as OMF:
    "OMF export failed due to internal error"
    I tried OMF 1 and 2.
    I tried to uncheck "include audio".
    I tried an extremely small project (one region, one track).
    Always the same error. Problem is there's absolutely no information on what kind of internal error this is.
    All other export options (AAF, Open TL and Final Cut) work.
    Any ideas?

    I suppose that besides CCM 4.1(2) you are using AC 1.4(1); if yes, then you need to upgrade to AC 1.4(1) ES11, or upgrade at least to CCM 4.1(3) ES42 or CCM 4.1(3) sr3c.
    HTH
    //Jorge

  • 'OMF export failed due to internal error' message

    Hi,
    Has anyone ever got an 'OMF export failed due to internal error' message when trying to export in OMF format?
    I get that every time I try to export in that format.
    How can I make it work?
    Thanks and best,

    Hello sj holcombe,
    And welcome to Apple Discussions!
    Have you had a chance to check out this Apple support document directly related to this issue? If not, it is a good place to start.
    [iTunes displays a -69 error when syncing iPod|http://support.apple.com/kb/HT1210]
    Hope it helps.
    B-rock

  • How can I generate a data merge that includes a barcode from excel spreadsheet to InDesign?

    Can you please advise if the following process is possible within InDesign CS 5.5? (Mac OS 10.8.2).
    We need to create a data merge that includes a barcode. The data merge is for several thousand records.
    We have a customer-supplied Excel spreadsheet that contains seven fields, one of which is a number that ultimately needs to print as a barcode when we make the data merge in InDesign.
    As you would expect, the conventional six fields data merge perfectly - but we can't find a way to make the barcode print correctly within the data merge.
    We have investigated several possible solutions without success.
    Some solutions suggest a plug-in for Excel to convert the number into a barcode whilst still in the spreadsheet - but this doesn't come across correctly when we import the CSV file into the InDesign data merge.
    We have investigated certain third party software applications but none provide the full solution we require. Some, for example, appear to create the barcode within the Excel spreadsheet but don't produce the barcode correctly when we data merge in InDesign; and some will allow us to create a one-off barcode within InDesign but have no automated way of retrieving the thousands of Excel numbers and turning them into barcodes.
    We have also come across possible 'font' solutions but have had no success with these either.
    For a previous job we outsourced the work to a specialist mailing company, who assured us they produced the whole job (very quickly) within InDesign. They, understandably, won't tell us how they did it. But, given the speed with which they produced the job for us, we imagine - with the correct software - the whole process is easy to do.
    Can you please advise?

    See the second posting at How do I generate a data merge in InDesign from an excel spreadsheet which includes a barcode
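    One of the 'font' approaches mentioned above can also be sketched quite simply: most Code 39 barcode fonts need the merge data wrapped in asterisks (the start/stop characters) before the font is applied. A small, hedged Python example that adds such a column to the CSV ahead of the data merge (file and column names here are invented for illustration):

    import csv

    # Hypothetical export with seven fields, one of them called "Number".
    with open("records.csv", newline="") as src, \
         open("records_with_barcode.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames + ["Barcode39"])
        writer.writeheader()
        for row in reader:
            # Code 39 fonts expect *1234567*; apply the barcode font to
            # this field in the InDesign merge template.
            row["Barcode39"] = "*{}*".format(row["Number"])
            writer.writerow(row)

    In InDesign the Barcode39 placeholder is then formatted with the barcode font while the other six fields keep their normal styling.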

  • Is PGI date based on system date or user profile time zone date?

    Hi All,
    When doing PGI, which date will the system take as the actual goods issue date: the system date, or the user's local time zone date maintained in the user profile? If you have any idea on this, please share.
    Regards,
    Lakshmikanth

    Hi,
    You need to check with the Basis team about setting this to your local time.
    You can also check the field Personal time zone of the user in t.code SU01. I am not sure whether this will update the timings in the database when the user performs some action.
    Regards

  • Getting Error in Data Collection - FDPSTP failed due to ORA-23401

    Hi All,
    I am really new to ATP Setup and, after referring to the Metalink notes, I have completed the setups required to calculate ATP in the Sales Order window.
    But when I am submitting the Data Collection which is the last step I guess, the program is ending with Error.
    I am getting the error.
    FDPSTP failed due to ORA-23401: materialized view "APPS"."MTL_OH_QTYS_SN" does not exist.
    And because of this, when I go to the Sales Order to check the ATP, I get the error "Try Calculating ATP Again".
    Please Help. Project is into a critical stage and I am stuck with this error.
    With Regards,
    Vishal Majithia

    Please mention the application release.
    You wrote: I am really new to ATP Setup and after referring to the Metalink notes, I have completed the setups required to calculate ATP in Sales Order Window. - What is the document you are referring to?
    You wrote: I am getting the error FDPSTP failed due to ORA-23401: materialized view "APPS"."MTL_OH_QTYS_SN" does not exist. - Please see if these docs help.
    Refresh Snapshot Errors With ORA-12034: Snapshot Log On "BOM"."BOM_SUB_OPERATION_RESOURCES" Younger Than Last Refresh [ID 180705.1]
    How to Rebuild Source Snapshots Used in the Data Collection or Planning Data Pull MSCPDP [ID 148584.1]
    Thanks,
    Hussein

  • ODBC from Access 2010 - where have my exported tables gone?

    Story so far:
    Environment Windows 8.1 Office 2013 both 64 bit
    Got an Access 2010 DB which is too big (1.8GB), so I want to do an SQL Express back end and an Access front end.
    I install SQL 2012 Express on my local system CCW8.
    I create a DB in SQL Management Studio (called OUS). In Access 2013 I create an ODBC datasource (it does not ask for 32 or 64 bit) called CCW8\OUS
    I can export tables to this datasource.
    I delete the local tables in Access.
    I attach the dbo.TABLENAME SQL tables
    My data is visible.
    But I need to find these tables in SQL Management Studio to do the database diagrams. I can't find them.
    My OUS database exists but has no tables.
    After a reboot, the datasource is no longer visible in ODBC datasources (32 or 64 bit), but the Access DB can still "see" some data.
    Can someone point me to a reference?
    Also, should I stop using Access as a front end? It's easy for users coming from Excel and Office VBA.
    CarolChi

    Hello,
    It seems that the Access data export to the ODBC data source failed.
    Did you try to export the Access data into the SQL Server database with other methods? For example, using the SQL Server Migration Assistant for Access, or the SQL Server Import and Export Wizard from SQL Server Management Studio?
    In order to use Access as a front end for a SQL Server database, you can build an ODBC connection to your SQL Server database and create linked tables in Access. A quick way to check which database the exported tables actually ended up in is sketched at the end of this reply.
    Reference: http://office.microsoft.com/en-in/access-help/link-to-sql-server-data-HA102809758.aspx
    Regards,
    Fanny Liu
    TechNet Community Support
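    A quick, hedged way to check which database the exported tables actually landed in (with an ODBC export they can end up in the login's default database rather than OUS) is to list them from outside Access, for example with Python and pyodbc; the driver, server and database names below are only examples:

    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=CCW8\\SQLEXPRESS;"        # example instance name
        "DATABASE=OUS;"
        "Trusted_Connection=yes;"
    )
    for schema, name in conn.execute(
        "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES "
        "WHERE TABLE_TYPE = 'BASE TABLE' ORDER BY TABLE_SCHEMA, TABLE_NAME"
    ):
        print(schema, name)

    Running the same query against the other databases on the instance (including the login's default database) usually shows where the export went.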

  • Creating a fiscal date table that includes Day of Year and Week of Year column

    Hi,
    Our fiscal year runs from 1st October through to 30th September and I'm trying to create a date table that I can use to represent this in PowerPivot (I'm using Excel 2013). I had no problem creating columns for fiscal month or fiscal year; however, I am
    having issues breaking it down further than that. Specifically, I cannot work out how to calculate:
    - day of the year
    - week of the year (with week one starting on the first day of the year rather than the first Monday)
    I worked out how to do the day of the year until I had to try and account for leap years, which broke the formula I had! Can anyone please recommend solutions? I cannot see anything online about this. Obviously I know that it is simple to do this for a normal
    calendar year, but this approach doesn't work with a custom fiscal year.
    Whilst it is useful for several reasons, my main reason for wanting week of year is so that I can plot my PivotCharts more smoothly (although I may have missed an easier way to do that).
    Thanks,
    James

    For my example I started with 1 column in Power Pivot called [Date]
    I created the following Columns
    [DayofMonth] =Day([Date])
    [MonthNumber] =Month([Date])
    [Fiscal Year] =IF([MonthNumber]< 10, Year([Date]), (Year([Date])+1))
    [Fiscal Day] =[Date] - DATE(([Fiscal Year] -1), 10,1) +1
    [Fiscal Week] =RoundUp([Fiscal Day] / 7, 0)
    Change [Date] to whatever your date column is and those DAX formulas will get you what you need
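    For anyone who wants to sanity-check the leap-year behaviour outside PowerPivot, the same arithmetic can be written as a small Python sketch (the 1st-October start is hard-coded and the dates are only examples; this mirrors the DAX above rather than replacing it):

    from datetime import date
    import math

    def fiscal_parts(d, fy_start_month=10):
        # Fiscal year, day-of-fiscal-year and week-of-fiscal-year for a
        # fiscal year starting on the 1st of fy_start_month (October here).
        fiscal_year = d.year + 1 if d.month >= fy_start_month else d.year
        fy_start = date(fiscal_year - 1, fy_start_month, 1)
        fiscal_day = (d - fy_start).days + 1        # 1 on 1st October
        fiscal_week = math.ceil(fiscal_day / 7)     # week 1 starts on day 1, not the first Monday
        return fiscal_year, fiscal_day, fiscal_week

    print(fiscal_parts(date(2016, 2, 29)))   # (2016, 152, 22) - leap day handled by plain date arithmetic
    print(fiscal_parts(date(2013, 10, 1)))   # (2014, 1, 1)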

  • Using USB NI-6008 on LV7.1, the results data has to be in Excel. It looks like a cluster data format that I cannot insert into the Excel VI

    I am using the USB-6008 with LV 7.1. The application is a voltage logger and I would like to upload the data into Excel. The columns will be the channels and the rows will be the data.
    The data comes from the DAQmx VI not as a simple string or a number; it is a kind of cluster data
    which contains strings, numbers and more for each measurement.
    I tried to work with the spreadsheet VIs and it works OK, but it is not convenient to convert to Excel later.
    Do you have a solution for that please?
    (How do I extract the data from the cluster type format that is used with the USB-6008?)
    TX arik
    Attachments:
    USB-6008-r3.vi ‏80 KB

    Arik:
    I hope all is well. If you are trying to write data to a text file along with the header information, you could do the following:
    1. Open a reference to a file and write the header information right away.
    2. Do your continuous acquisition (in a loop) using the 6008 and write the data to the file every time.
    3. With the proper formatting, your columns will represent channels and your rows the data (a rough sketch of that layout is shown below, after the attachment).
    Please take a look at the attached example and let me know if you have any other questions.
    Regards,
    Rudi N.
    Attachments:
    WriteFilewHeaders.vi ‏64 KB
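    Below is a rough sketch of what the text file in steps 1-3 ends up looking like - written in Python rather than LabVIEW, purely to show the layout (the channel names and readings are invented): a header row of channel names written once, then one row per scan.

    import csv

    channels = ["ai0", "ai1", "ai2"]                 # example channel names
    scans = [[0.012, 1.234, 2.501],
             [0.015, 1.229, 2.498]]                  # one inner list per loop iteration

    with open("voltage_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["sample"] + channels)       # header written once, up front
        for i, row in enumerate(scans):              # in LabVIEW this is the acquisition while loop
            writer.writerow([i] + row)               # columns = channels, rows = data

    A .csv laid out this way opens directly in Excel with the channels as columns.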

  • Custom date formats that can be used in OBIEE 11g

    Hi,
    I'm new here,
    What date formats can be used in OBIEE 11g?
    Can we use a custom one like this: DD-MM-YYYY? Is it case sensitive, e.g. dd-MM-yyyy?
    Please help
    Joe
    Edited by: JoeSSI on Mar 1, 2012 8:32 PM

    Hi,
    Both should work. Refer to the below:
    Go to your required column --> right-click and choose Properties, then select the Data Format tab --> check "Override Default Data Format" --> select the Custom option and put in your required format; both options work.
    If you use
    DD-MM-YYYY ===> you will get the 01-03-2012 format
    or
    dd-MM-yyyy
    Thanks
    Deva

  • AD MA Large Group (9000+ users) export failing due to connectivity timeout:: cd-connectivity-error

    I am currently implementing FIM 2010 in a test environment. I have MAs connected to the FIM Portal, an Oracle view, AD, and eDirectory 8.8 SP6. All agents are working fine, user attribute flow is working great to all connected systems, and groups are flowing
    into the MV without issue. I have a few criteria-based groups in the FIM portal that I am synchronizing to AD. All but one of those groups is synchronizing correctly. The failing group contains by far the most members of any group I've attempted to export
    so far, numbering around 9000 members. The only error I receive on an export run on the AD MA is "cd-connectivity-error" for the group object itself and the agent shows a run status of "dropped-connection." Event Viewer simply shows:
    The management agent "Test Active Directory MA" failed on run profile "Export" because of connectivity issues.
    Additional Information
    Discovery Errors : "0"
    Synchronization Errors : "0"
    Metaverse Retry Errors : "0"
    Export Errors : "1"
    Warnings : "0"
    User Action
    View the management agent run history for details.
    While 9000 members in a group is not insubstantial I can't help but feel like this should work. Are there configuration changes I need to make to AD in order to allow a longer timeout on the connection itself? I have tried increasing the timeout on the export
    run profile without success. Any guidance would be greatly appreciated.
    ~Jeff

    @Evgeniy Oddly enough, that is exactly the timeout change I made - 300 seconds. I am currently populating the group through the FIM Portal. It's a criteria-based group. Basically I am taking every employee status type (we have ~12 of them) and I am creating
    sub-groups for each. The largest one of these is the one I'm timing out on. The second largest one has ~2500 members and functions as expected. If I need to change this to provision with .dll code I certainly can - I was just hoping to provide some transparency
    for our support staff and my fellow admins on these groups.
    @Carol Good idea. I will do just that. I have 2 DC's isolated in a site with an exorbitantly high site link cost specifically to service our LDAP clients and other back end services. Those 2 DC's also happen to host the majority of our FSMO roles here as
    well. I'll hardcode the link to one of those DC's and report back.
    As a secondary question, would it be better perhaps if I simply used set transitions on the user accounts to individually add them to these groups? I don't plan on using employee status for any other functions beyond this one so it seems like it may be more
    complicated to go that route but it may also be more stable.
    Thanks for all the great replies! I'll keep you posted.

  • Back Up Failed Due to lack of space on Time Capsule

    I purchased a new mac...was able to recover the data off my time capsule just fine...then returned the mac after getting my old mac back from repair...and I haven't been able to back up to my time capsule since...I keep receiving the following error...
    This backup is too large for the backup disk. The backup requires 224.69 GB but only 178.45 GB are available.
    Time Machine needs work space on the backup disk, in addition to the space required to store backups. Open Time Machine preferences to select a larger backup disk or make the backup smaller by excluding files.
    I have no problem removing data from my time capsule to make room for a backup, but I don't know how... I just don't understand why it's not removing the old backups automatically to create the new backup as it did in the past, before I switched between the new mac and my old mac...
    Please advise...Thank you

    Download http://support.apple.com/kb/DL148
    Erase the time capsule, then do a backup.

  • How can I import data from history including time of visit to site?

    When I open my "History" tab I can then select from the left hand column, for example "last seven days" and then select "copy" from the "Organize" tab. But when I paste into an Excel worksheet, the only information that shows up is the website. Is there a way to also import the "visit date" column?

    You would have to use an extension like SQLite Manager to export the history as CSV.
    *[[/questions/835204]]
    You can add other columns when needed.
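    If the extension route is awkward, the same export can be scripted directly against a copy of places.sqlite from the Firefox profile folder. A hedged sketch in Python (the file name is an example; moz_places/moz_historyvisits is the history schema Firefox uses, with visit_date stored as microseconds since the Unix epoch):

    import csv
    import sqlite3
    from datetime import datetime, timezone

    # Work on a copy of places.sqlite; Firefox locks the live file while running.
    conn = sqlite3.connect("places_copy.sqlite")
    rows = conn.execute(
        "SELECT p.url, p.title, v.visit_date "
        "FROM moz_historyvisits AS v "
        "JOIN moz_places AS p ON p.id = v.place_id "
        "ORDER BY v.visit_date DESC"
    )

    with open("history.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["visit date (UTC)", "title", "url"])
        for url, title, visit_us in rows:
            when = datetime.fromtimestamp(visit_us / 1_000_000, tz=timezone.utc)
            writer.writerow([when.isoformat(), title or "", url])

    The resulting history.csv pastes into Excel with the visit date as its own column.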

  • Timestamp formats that include literal data in quotes

    Is there an alternate way to specify literal text in timestamp formats, other than to quote it with double quotes? I am trying to work around a bug in SQLDeveloper (described below).
    The formats I am using are:
    timestamp: YYYY-MM-DD"t"HH24:MI:SS.FF6
    timestamp tz: YYYY-MM-DD"t"HH24:MI:SS.FF6TZH:TZM
    These formats work fine in sqlplus and in my application:
    SQL> alter session set nls_timestamp_format = 'YYYY-MM-DD"t"HH24:MI:SS.FF6';
    Session altered.
    But when I use them in the Database->NLS Parameters section in Preferences, I get the following error:
    ORA-02248: invalid option for ALTER SESSION
    If I remove the "t" from the middle, there are no errors in SQLDeveloper and the formats work correctly in the sql worksheet.
    SQLDeveloper: 1.1.2.25 on Windows XP Pro
    Oracle server: 9.2.0.5.0, 64bit running on Solaris
    Thanks,
    Philip

    Hi JW,
    if you want to track changes, implement the how-to paper about "line-items" (see SAP Service Marketplace). BPS layout will always show the latest data (price) and to show the history, you can use a simple BW query with drill-down on the timestamp.
    Regards
    Marc
    SAP NetWeaver RIG

  • Load getting failed due to short dump

    Hi All,
    The process chain is failing due to the following short dump: Include report LZFI_CA_PCA_DERIVATIONSU37 not found. Please suggest how to rectify this dump. It's a delta load.
    Regards,
    Asim

    Hi All
    As requested, please find the detailed short dump below. The datasource name is GECPC_GLPCP and its extractor is ZPCC_V_GLPCP.
    Short text
       Syntax error in program "SAPLZFI_CA_PCA_DERIVATIONS ".
    What happened?
       Error in the ABAP Application Program
       The current ABAP program "GP9IQHCDEPFH201E24O25IAYXXW" had to be terminated
        because it has
       come across a statement that unfortunately cannot be executed.
       The following syntax error occurred in program "SAPLZFI_CA_PCA_DERIVATIONS " in
        include "LZFI_CA_PCA_DERIVATIONSUXX " in
       line 77:
       "INCLUDE report "LZFI_CA_PCA_DERIVATIONSU37" not found"
       The include has been created and last changed by:
       Created by: "SAP* "
       Last changed by: "SAP* "
       Error in the ABAP Application Program
       The current ABAP program "GP9IQHCDEPFH201E24O25IAYXXW" had to be terminated
        because it has
       come across a statement that unfortunately cannot be executed.
    What can you do?
        Please eliminate the error by performing a syntax check
        (or an extended program check) on the program "SAPLZFI_CA_PCA_DERIVATIONS ".
        You can also perform the syntax check from the ABAP Editor.
        If the problem persists, proceed as follows:
        Note down which actions and inputs caused the error.
        To process the problem further, contact your SAP system
        administrator.
        Using Transaction ST22 for ABAP Dump Analysis, you can look
        at and manage termination messages, and you can also
        keep them for a long time.
    Error analysis
        The following syntax error was found in program "SAPLZFI_CA_PCA_DERIVATIONS ":
        "INCLUDE report "LZFI_CA_PCA_DERIVATIONSU37" not found"
    How to correct the error
        Probably the only way to eliminate the error is to correct the program.
        If you cannot solve the problem yourself and want to send an error
        notification to SAP, include the following information:
        1. The description of the current problem (short dump).
           To save the description, choose "System->List->Save->Local File
        (Unconverted)".
        2. Corresponding system log.
    Display the system log by calling transaction SM21.
    Restrict the time interval to 10 minutes before and five minutes
    after the short dump. Then choose "System->List->Save->Local File
    (Unconverted)".
    If the problem occurs in a program of your own or a modified SAP
    program: The source code of the program.
    In the editor, choose "Utilities->More
    Utilities->Upload/Download->Download".
    Details about the conditions under which the error occurred or which
    actions and input led to the error.
    Information on where terminated
        Termination occurred in the ABAP program "GP9IQHCDEPFH201E24O25IAYXXW" - in
         "START_ROUTINE".
        The main program was "RSBATCH_EXECUTE_PROZESS ".
        In the source code you have the termination point in line 102
        of the (Include) program "ZFICA_DERIVE_PCA_OBJECTS_S".
        The program "GP9IQHCDEPFH201E24O25IAYXXW" was started as a background job.
        Job Name....... "BIDTPR_6726959_1"
        Job Initiator.. "ALEREMOTE"
        Job Number..... 17271800
    Line  SourceCde
       72 ASSIGN 'L_DERIVE_CHRT_ACCTS-UPD_CHRT_AC
       73 assign (<derive_chrt_accts_field>) to
       74 IF <derive_chrt_accts> IS ASSIGNED.
       75   zzi_attributes_upd-chrt_accts_upd = '
       76 ENDIF.
       77 zzi_attributes_upd-/bic/gactivity_upd
       78 zzi_attributes_upd-/bic/gfactv_upd
       79 zzi_attributes_upd-/bic/gglobacct_upd
       80 zzi_attributes_upd-/bic/ggforacc_upd
       81 zzi_attributes_upd-/bic/ggrracc_upd
       82 zzi_attributes_upd-/bic/ggra_code_upd
       83 zzi_attributes_upd-/bic/gsegmtion_upd
       84 zzi_attributes_upd-/bic/ggaapflg_upd
       85 zzi_attributes_upd-/bic/gsub_item_upd
       86 zzi_attributes_upd-/bic/gsegmnt_upd
       87 zzi_attributes_upd-/bic/gacctgrp_upd
       88 zzi_attributes_upd-comp_code_details_up
       89 zzi_attributes_upd-gbfi0600_upd
       90 zzi_attributes_upd-/bic/zaccsuf_upd
       91 zzi_attributes_upd-/bic/gconunit_upd
       92 zzi_attributes_upd-/bic/gtpunit_upd
       93 *--------------------------------------
       94 * Store a copy of the source fields if
       95 *--------------------------------------
       96 MOVE-CORRESPONDING <source_fields> TO z
       97 CLEAR zze_attributes.
        98 *----------------------------------------------------------------------*
        99 * Derive all PCA fields for reporting
       100 *----------------------------------------------------------------------*
       101
    >>>>> CALL FUNCTION 'ZFI_CA_DERIVE_PCA_ATTRIBUTESII'
       103   EXPORTING
       104     i_zfi_ca_pca_attributes     = zzi_attributes
       105     i_zfi_ca_pca_attributes_upd = zzi_attributes_upd
       106     msgty                       = 'I'
       107     recno                       = <source_fields>-record
       108   IMPORTING
       109     e_zfi_ca_pca_attributes     = zze_attributes
       110   TABLES
       111     monitor_recno               = lt_monitor_recno.
       112
       113 *----------------------------------------------------------------------*
       114 * On the recurrsive load delete entries where the data is the same
       115 *----------------------------------------------------------------------*
       116 IF zzg_realignment_run = 'X'.
       117   IF  zzi_attributes = zze_attributes.
       118     DELETE  source_package INDEX la_tabix.
       119   ELSE.
       120     MOVE-CORRESPONDING zze_attributes TO <source_fields>.
       121   ENDIF.
    SY-SUBRC 0
    SY-INDEX 3
    SY-TABIX 1
    SY-DBCNT 10
    SY-FDPOS 0
    SY-LSIND 0
    SY-PAGNO 0
    SY-LINNO 1
    SY-COLNO 1
    SY-PFKEY
    SY-UCOMM
    SY-TITLE Execute Batch Proces
    SY-MSGTY E
    SY-MSGID R7
    SY-MSGNO 057
    SY-MSGV1 ZBAL_LC
    SY-MSGV2 A
    SY-MSGV3
    SY-MSGV4
    SY-MODNO 0
    SY-DATUM 20140804
    SY-UZEIT 172720
    SY-XPROG SAPCNVE
    SY-XFORM CONVERSION_EXIT
    No.   Ty.          Program                             Include                             Line
          Name
        8 FUNCTION     GP9IQHCDEPFH201E24O25IAYXXW         ZFICA_DERIVE_PCA_OBJECTS_S            102
          START_ROUTINE
        7 METHOD       GP9IQHCDEPFH201E24O25IAYXXW         ZFICA_DERIVE_PCA_OBJECTS_S            102
          LCL_TRANSFORM=>START_ROUTINE
        6 METHOD       GP9IQHCDEPFH201E24O25IAYXXW         GP9IQHCDEPFH201E24O25IAYXXW          1299
          LCL_TRANSFORM=>EXECUTE
        5 METHOD       CL_RSTRAN_TRFN_CMD============CP    CL_RSTRAN_TRFN_CMD============CM005    70
          CL_RSTRAN_TRFN_CMD=>IF_RSBK_CMD_T~TRANSFORM
        4 METHOD       CL_RSBK_PROCESS===============CP    CL_RSBK_PROCESS===============CM00Q    93
          CL_RSBK_PROCESS=>PROCESS_REQUEST
       3 METHOD       CL_RSBK_PROCESS===============CP    CL_RSBK_PROCESS===============CM002    13
          CL_RSBK_PROCESS=>IF_RSBATCH_EXECUTE~EXECUTE
        2 FUNCTION     SAPLRSBATCH                         LRSBATCHU13                           437
          RSBATCH_EXECUTE_PROCESS
        1 EVENT        RSBATCH_EXECUTE_PROZESS             RSBATCH_EXECUTE_PROZESS               374
          START-OF-SELECTION
    REQUEST
        DTPR_1FR9TKO2U6M616ATG9C2EYKRH
        455553453544353433345434345454
        4402F16294BF256D61614793259B28
        000000000000000000000000000000
        000000000000000000000000000000
    DATAPACKID
        000001
        333333
        000001
        000000
        000000
    MONITOR
        Table[initial]
    SOURCE_PACKAGE
        3000000001012P2013K4AU0ACD 2013005          00                                               0
        3333333333333533334345344423333333222222222233222222222222222222222222222222222222222222222223
        300000000101202013B415013402013001000000000000000000000000000000000000000000000000000000000000
        0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
        0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
        3333333333333533334345344423333333222222222233222222222222222222222222222222222222222222222223
        300000000101202013B415013402013002000000000000000000000000000000000000000000000000000000000000
        0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
        0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
        3333333333333533334345344423333333222222222233222222222222222222222222222222222222222222222223
        300000000101202013B415013402013003000000000000000000000000000000000000000000000000000000000000
        0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
        0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
        3333333333333533334345344423333333222222222233222222222222222222222222222222222222222222222223
        300000000101202013B415013402013004000000000000000000000000000000000000000000000000000000000000
      00000000000000000000000000000000000
      00000000000000000000000000000000000
      33333333333335333343453444233333332
      300000000101202013B4150134020130050
      00000000000000000000000000000000000
      00000000000000000000000000000000000
      Table IT_110723[600000x1144]
      {A:114*\TYPE=%_T00003S00000115O0000
      Table reference: 3570
      TABH+  0(20) = 90A161080000000060C1
      TABH+ 20(20) = F20D000083B00100C027
      TABH+ 40(16) = 03730000242D00000400
      store        = 0x90A1610800000000
      ext1         = 0x60C1190C00000000
      shmId        = 0     (0x00000000)
      id           = 3570  (0xF20D0000)
      label        = 110723 (0x83B00100)
      fill         = 600000 (0xC0270900)
      leng         = 1144  (0x78040000)
      loop         = 128   (0x80000000)
      xtyp         = ROL.115.11556
      occu         = 4     (0x04000000)
      accKind      = 1     (ItAccessStand
      idxKind      = 0     (ItIndexNone)
      uniKind      = 2     (ItUniNo)
      keyKind      = 1     (default)
      cmpMode      = 8     (cmpManyEq)
      occu0        = 1
      stMode       = 0
      groupCntl    = 0
      rfc          = 0
      unShareable  = 0
      mightBeShared = 1
      sharedWithShmTab = 0
      isShmLockId  = 0
      isUsed       = 1
      isCtfyAble   = 1
      hasScndKeys  = 0
      hasRowId     = 0
      scndKeysOutdated = 0
      scndUniKeysOutdated = 0
      ----- Shareable Table Header Data -----
      tabi         = 0xB008190C00000000
      pgHook       = 0x404CE73B00000000
      idxPtr       = 0x0000000000000000
      id           = 61767 (0x47F10000)
      shmTabhSet   = 0x0000000000000000
      refCount     = 1     (0x01000000)
      tstRefCount  = 0     (0x00000000)
      lineAdmin    = 606204 (0xFC3F0900)
      lineAlloc    = 600004 (0xC4270900)
      shmVersId    = 0     (0x00000000)
      shmRefCount  = 2     (0x02000000)
      rowId        = 18446744073709551615
      scndKeyAdmin = 0x0000000000000000
      ----- 1st level extension part --------
      regHook      = 0xC8C0190C00000000
      collHook     = 0x0000000000000000
      ext2         = 0x0000000000000000
    RI-CLUSTR
       0
       00
       00
    ZE_ATTRIBUTES
                         00000
       22222222222222222233333
       00000000000000000000000
       00000000000000000000000
       00000000000000000000000
    ZI_ATTRIBUTES
       AU0A    070221000020130
       45342222333333333333333
       15010000070221000020130
       00000000000000000000000
       00000000000000000000000
    ZI_ATTRIBUTES_UPD
       XXXXX XXXXXX  XX XXXX
       55555255555522552555522
       88888088888800880888800
       00000000000000000000000
       00000000000000000000000
    SOURCE_FIELDS>-RECORD
       1
       0000
       1000
    T_MONITOR_RECNO
       Table[initial]
    ZG_REALIGNMENT_RUN
       2
    Regards,
    Asim
    GECPC_GLPCP   

Maybe you are looking for

  • My iPhone 4 + iOS 4.2 = iBrick

    Upgrading to 4.2 from 4.1_8B117 (the last version with no jailbreak) in a MacBook Pro with all patches (including the last version of iTunes). iTunes alert me that an "unknown error (10)" has been reported (even if I try to restore). I've deleted the

  • Dynamic Mail Messenger - mail to distribution list

    Hi Mentors, The requirement is to generate an automatic mail when an employee updates his marital status or children status in IT0002. The mail has to go to the master data administrator and one of the persons in the Accounts team to initiate insuran

  • Public user sees old content of items in content area/portlets

    I have items in my Content Area, as pl/sql folders, text and pl/sql items. These are all published as portlets. Access is set to public. The content of these items have been altered a couple of times during development, and they work as they should w

  • Upload BI-0PS data

    Hello guys!   I´m trying to upload BI-0PS data, like: Network,Activity and work elements.   I need upload 'Work' and 'Actual Work' datas. I was searching and I saw it can be easily update with standar infosources  Network dates (0PS_DAT_NTW), Network

  • "Ordered" Quantity Wrong in Item Master

    I have a client that found the "ordered" quantity in the item master is incorrect for a specific item.  After some research, I have found that this is the scenario that caused it: 1.) Production Order created with planned quantity of 1. 2.) Reported