Error in one of the Data Mover scripts during Campus Solutions installation

Hi everybody,
This is my first attempt at installing one of the PeopleSoft products.
I am installing HCM 9.0 on Windows 2008 64-bit with Oracle 11g.
I am now at the task called 7A-16-9: Updating PeopleTools System Data.
I ran pt849tls.dms successfully,
but pt850tls.dms failed with this error:
File: Data MoverSQL error. Stmt #: 0 Error Position: 25 Return: 904 - ORA-00904: "PT_RETENTIONDAYS": invalid identifier
Failed SQL stmt:UPDATE PS_PRCSSYSTEM SET PT_RETENTIONDAYS=RETENTIONDAYS
Error: SQL execute error for UPDATE PS_PRCSSYSTEM SET PT_RETENTIONDAYS=RETENTIONDAYS
Is there a script that I have missed?
I followed the instructions step by step.
Thanks for your help,,,

I highly appreciate your efforts.
I ran rel849un.sql and rel850un.sql successfully (I checked the corresponding log files and they show no errors).
The table description is:
CREATE TABLE "SYSADM"."PS_PRCSSYSTEM"
(     "OPSYS" VARCHAR2(1 CHAR),
     "RETENTIONDAYS" NUMBER(*,0),
     "PRCSPURGENEXTDTTM" TIMESTAMP (6),
     "RECURDTTM" TIMESTAMP (6),
     "PRCSPURGERECURNAME" VARCHAR2(30 CHAR),
     "PURGEPRCSFILES" NUMBER(*,0),
     "ARCH_PROCESSED" VARCHAR2(1 CHAR),
     "PRCSSYSLOADOPT" VARCHAR2(2 CHAR),
     "VERSION_UPDATED" VARCHAR2(1 CHAR))
Best Regards,,,

Similar Messages

  • Problems Generating Data Move Scripts in v1.5.1

    Hi, I'm having problems when trying to generate data move scripts in SQL Dev 1.5.1 to carry out an off-line data load. I'm carrying out a migration from Sybase to Oracle, and the database I'm working on has over 400 tables in it. I have successfully captured and migrated all the tables into the respective models and have generated and created the DDL for the converted model. However, when I request the data move scripts to be generated, I only get ctl files created for the first 49 tables. There is also no oracle_ctl.sh script created, and no post_load.sql script is produced, only a pre_load.sql script.
    I've got three databases to migrate, and on the second database I only get data move scripts created for the first 86 tables out of more than 250.
    It appears to have worked better for the third database, which is much smaller than the first two, having only 59 tables in it. This time all the files were produced as expected. However, it's really the first two larger databases that are my priority to get migrated.
    I've tried changing the preferences within Migration/Generation Options from 'One Single File' to 'A File per Object', but it makes no difference. I would prefer everything in one file but can work around that.
    Ideally, I'd like to generate all the ctl files for a database in one go so that I can group-edit them, and I would prefer the tool to create the oracle_ctl.sh script to call all the ctl scripts for me rather than having to hand-build it. I'm puzzled as to why the tool only creates ctl files for some of the tables contained within a converted model. It looks like it is not completing the job in these cases, as it also doesn't create all the scripts that it is supposed to create. It doesn't give out any error messages, and at completion the screen looks no different from when it works successfully in the case of the very small database.
    Has anybody had this problem, or can anyone suggest how to fix it?
    Thanks all.

    Send me your phone number to [email protected]
    We'll help sort this out.
    Barry

  • Error -50 after failed data move to Samsung hybrid internal drive.

    The original data move from an external WD 3TB USB drive fails with error -36. Then attempting to write any data to that drive gives a -50 error. The only cure I have found is to re-partition the drive. Fixing permissions, verifying the drive, repairing the drive, deleting .DS files, etc.: nothing works but a partition recreation. Any clues? 2009 MBP, 6 GB RAM, 256 GB Samsung 840 Pro SSD primary drive, 1 TB Samsung hybrid as secondary drive.

    That particular error happens on occasion after switching a TM drive from one Mac to another, erasing your TM disk/partition, or attaching a new TM drive with the same name as an old one.
    You can usually fix this by simply re-selecting your TM drive in TM Preferences > Change Disk.
    Since your backups are now working, it's likely ok.

  • Can you set a default value using Data Mover Scripts?

    Hi,
    We're going through an upgrade of PS Fin 8.9 -> 9.1. We've found some existing tables with new columns that are marked as NOT NULL, so we're having trouble migrating the data using DMS.
    Is there a way to Export the data from 8.9 and when importing into 9.1, set a default value for the new column using DMS?
    A specific example would be the PS_SOURCE_TBL where EXCHANGE_RATE_OPTN is new and NOT NULL.
    Thanks for any assistance.

    What you could do is first rename the record to be imported:
    IMPORT SOURCE_TBL AS PS_SOURCE_TBL_ORG;
    Then write an insert/select script from PS_SOURCE_TBL_ORG to PS_SOURCE_TBL, defaulting column EXCHANGE_RATE_OPTN:
    INSERT INTO PS_SOURCE_TBL (COLUMN, COLUMN, COLUMN, COLUMN, EXCHANGE_RATE_OPTN)
    SELECT COLUMN, COLUMN, COLUMN, COLUMN, 'defaultvalue' FROM PS_SOURCE_TBL_ORG;
    And then drop table PS_SOURCE_TBL_ORG to clean up your database.
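    Pulling that together, a hedged sketch of the whole sequence as one Data Mover script might look like the following; the file names are hypothetical and the COLUMN placeholders stand in for the real column list.
    -- Sketch only: import into a staging copy, re-insert with a default, clean up.
    SET LOG ps_source_tbl_import.log;
    SET INPUT ps_source_tbl.dat;
    IMPORT SOURCE_TBL AS PS_SOURCE_TBL_ORG;
    -- Copy the rows into the delivered table, defaulting the new NOT NULL column.
    INSERT INTO PS_SOURCE_TBL (COLUMN, COLUMN, COLUMN, COLUMN, EXCHANGE_RATE_OPTN)
    SELECT COLUMN, COLUMN, COLUMN, COLUMN, 'defaultvalue'
    FROM PS_SOURCE_TBL_ORG;
    -- Drop the staging copy once the data is verified.
    DROP TABLE PS_SOURCE_TBL_ORG;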

  • Error while running the root.sh script during Grid installation on a pre-installed 11g database.

    Hi Oracle Experts,
    I am trying to set up a new standalone Grid Infrastructure on a server with a previously installed Oracle 11g database.
    It all runs fine, but when it prompts me to run the root.sh script it does not let me proceed, as it asks whether to overwrite the existing files under /usr/local/bin.
    Well, I googled and tried answering the overwrite prompt with y. The script ran but it failed...
    Could you please help me with this?
    [root@asm ~]# /u01/app/11.2.0/grid/root.sh
    Running Oracle 11g root.sh script...
    The following environment variables are set as:
        ORACLE_OWNER= oracle
        ORACLE_HOME=  /u01/app/11.2.0/grid
    Enter the full pathname of the local bin directory: [/usr/local/bin]:
    The file "dbhome" already exists in /usr/local/bin.  Overwrite it? (y/n)
    [n]: y
       Copying dbhome to /usr/local/bin ...
    The file "oraenv" already exists in /usr/local/bin.  Overwrite it? (y/n)
    [n]: y
       Copying oraenv to /usr/local/bin ...
    The file "coraenv" already exists in /usr/local/bin.  Overwrite it? (y/n)
    [n]: y
       Copying coraenv to /usr/local/bin ...
    Entries will be added to the /etc/oratab file as needed by
    Database Configuration Assistant when a database is created
    Finished running generic part of root.sh script.
    Now product-specific root actions will be performed.
    To configure Grid Infrastructure for a Stand-Alone Server run the following command as the root user:
    /u01/app/11.2.0/grid/perl/bin/perl -I/u01/app/11.2.0/grid/perl/lib -I/u01/app/11.2.0/grid/crs/install /u01/app/11.2.0/grid/crs/install/roothas.pl
    [root@asm ~]# /u01/app/11.2.0/grid/perl/bin/perl -I/u01/app/11.2.0/grid/perl/lib -I/u01/app/11.2.0/grid/crs/install /u01/app/11.2.0/grid/crs/install/roothas.pl
    2015-03-18 01:42:25: Checking for super user privileges
    2015-03-18 01:42:25: User has super user privileges
    2015-03-18 01:42:25: Parsing the host name
    Using configuration parameter file: /u01/app/11.2.0/grid/crs/install/crsconfig_params
    Creating trace directory
    LOCAL ADD MODE
    Creating OCR keys for user 'oracle', privgrp 'oinstall'..
    Operation successful.
    CRS-4664: Node asm successfully pinned.
    Adding daemon to inittab
    CRS-4123: Oracle High Availability Services has been started.
    ohasd is starting
    acfsroot: ACFS-9320: Missing advmutil.
    acfsroot: ACFS-9320: Missing advmutil.bin.
    acfsroot: ACFS-9320: Missing fsck.acfs.
    acfsroot: ACFS-9320: Missing fsck.acfs.bin.
    acfsroot: ACFS-9320: Missing mkfs.acfs.
    acfsroot: ACFS-9320: Missing mkfs.acfs.bin.
    acfsroot: ACFS-9320: Missing mount.acfs.
    acfsroot: ACFS-9320: Missing mount.acfs.bin.
    acfsroot: ACFS-9320: Missing acfsdbg.
    acfsroot: ACFS-9320: Missing acfsdbg.bin.
    acfsroot: ACFS-9320: Missing acfsutil.
    acfsroot: ACFS-9320: Missing acfsutil.bin.
    acfsroot: ACFS-9301: ADVM/ACFS installation can not proceed:
    acfsroot: ACFS-9302: No installation files found at /u01/app/11.2.0/grid/install/usm/EL5/x86_64/2.6.18-8/2.6.18-8.el5xen-x86_64/bin.
    asm     2015/03/18 01:43:06     /u01/app/11.2.0/grid/cdata/asm/backup_20150318_014306.olr
    Successfully configured Oracle Grid Infrastructure for a Standalone Server
    When I checked for the ASM instance, it is not running; only the ohasd service is up and nothing else.
    [root@asm grid]# ps -ef | grep pmon
    oracle    5831     1  0 01:15 ?        00:00:01 ora_pmon_db11g1
    root     12625  8794  0 02:30 pts/2    00:00:00 grep pmon
    [root@asm grid]#
    [root@asm grid]#
    [root@asm grid]# ps -ef | grep d.bin
    oracle   12643     1  5 02:30 ?        00:00:00 /u01/app/11.2.0/grid/bin/ohasd.bin reboot
    root     12715  8794  0 02:30 pts/2    00:00:00 grep d.bin
    [root@asm grid]#
    Could you please help?

    Hi,
    The issue is not with /usr/local/bin. When you execute root.sh, it tries to configure ASM with the new GRID_HOME. The problem started here:
    CRS-4123: Oracle High Availability Services has been started.
    ohasd is starting <<=================================================================
    acfsroot: ACFS-9320: Missing advmutil.
    Please let us know the details below:
    ==> Is ACFS configured on the server?
    ==> The output of acfsutil registry and acfsutil info fs.
    ==> Without an ASM instance, how did the database and CRS start?
    ==> Please try stopping and starting the CRS.
    ==> The output of crsctl query crs activeversion.
    Regards
    Krishnan

  • Error: Could not continue scan with nolock due to data movement, DBCC proccache will clear the problem

    SQL Server: 2008 R2 SP2
    Before describing my problem: I have been through the forum, and there are no views or functions inside my stored procedure.
    When running a particular stored procedure from Crystal Reports, the error "Could not continue scan with nolock due to data movement" comes up once every few weeks. After I clear the cached query plan, it works again for a few weeks and then the problem comes back. During those weeks there is no restart and no query plan clearing.
    If I run the stored procedure inside SSMS, with the SQL statement copied and pasted from SQL Profiler during the Crystal Reports run, there is no error.
    I discovered that running in SSMS and in Crystal Reports generates two different query plans, even though I copied the SQL from SQL Profiler. I have saved the query plans, but unfortunately this forum does not accept attachments, otherwise I would post them here.
    One thing I notice about the query plan is that during a nested loop operation there is a warning, "no join predicate". I don't use any views or UDFs in the statement, nor do I use pre-1992 ANSI join syntax. However, I did use table variables.
    My guess is that this is what causes "Could not continue scan with nolock due to data movement". After I clear the cache and run the Crystal Report again, the "nested loop no join predicate" warning is gone from the plan.
    Running this stored procedure takes one second at most; even when the error pops up, it pops up within one second.
    DBCC CHECKDB has been run.
    The same stored procedure run by Crystal Reports in a SQL 2008 (non-R2) live environment has no problems, so I am thinking this is an R2-specific problem.
    The SQL statement with the "nested loop no join predicate" warning is below; no views, no UDFs, but table variables:
    INSERT @ChequeAccount
    SELECT PS.PaySummaryID, PS.EmployeeID, PS.CostCentreID,
           (PS.GrossPay + PS.LumpSumA + PS.LumpSumB + PS.LumpSumD + PS.LumpSumE + PS.ETP
            + PS.PaymentsAfterTax - PS.DeductionsAfterTax - PS.Tax - PS.ETPTax + PS.TaxRebate) * -1 AS Amount,
           CGLM.GLAccountID
    FROM Pay_Summary PS
    JOIN Input_Sheet ISH ON PS.InputSheetID = ISH.InputSheetID
         AND ISH.PayrollID = @binPayrollID
         AND PS.PaySummaryID NOT IN (SELECT PaySummaryID FROM @ChequeAccount)
    JOIN Payroll P ON P.PayrollID = ISH.PayrollID AND P.EmployerID = @binEmployerID
    JOIN CustomGLFixMapping CGLM ON CGLM.EmployerID = P.EmployerID AND CustomGLFixMappingNameID = 1 AND CGLM.CostCentreID IS NULL

    The error "Could not continue scan with nolock due to data movement" can occur when you use the NOLOCK table hint, or use the command SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED. That is, so-called dirty reads. The error is not related to the query plan per se, but when scanning a table, the storage engine will use an IAM scan rather than following the clustered index. If there is simultaneous activity, the storage engine may detect this and abort the operation to avoid returning incorrect data. Or it may not detect it, and return uncommitted data or fail to return committed data.
    All of these effects are transitory, and they will not show up when you are alone on the system, only when there is concurrent activity in one or more of the tables in the query.
    Using dirty reads is a risky business for the reasons explained above, and it takes careful analysis to understand whether you can live with the errors you can get from a particular query. The error about data movement can be handled: trap the error and resubmit the query. But what about spurious incorrect results?
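    A minimal sketch of that trap-and-resubmit idea on SQL 2008 R2 might look like this; the procedure name and parameter values are hypothetical (the parameter names come from the query above), and 601 is the error number behind the data-movement message.
    -- Sketch only: retry the call once if the dirty-read scan aborts (error 601).
    DECLARE @tries int = 0, @done bit = 0;
    WHILE @done = 0
    BEGIN
        BEGIN TRY
            EXEC dbo.usp_ChequeAccountReport @binPayrollID = 1, @binEmployerID = 1;  -- hypothetical proc/values
            SET @done = 1;
        END TRY
        BEGIN CATCH
            IF ERROR_NUMBER() = 601 AND @tries = 0
                SET @tries = 1;                 -- transient dirty-read abort: loop around and resubmit once
            ELSE
            BEGIN
                DECLARE @msg nvarchar(2048) = ERROR_MESSAGE();
                RAISERROR(@msg, 16, 1);         -- anything else, or a second failure: surface the message
                SET @done = 1;
            END
        END CATCH
    END;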
    If you believe locking to be a problem, you should consider setting the database to READ_COMMITTED_SNAPSHOT and taking out all use of READ UNCOMMITTED/NOLOCK. When the database is in READ_COMMITTED_SNAPSHOT, readers read from the snapshot and only see committed data without blocking writers. This has some other effects, like requiring a bigger tempdb, and there is a risk of other types of concurrency errors, but they tend to be smaller risks.
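    For reference, a minimal sketch of that switch; the database name here is hypothetical, and the change needs exclusive access (or a forced rollback of open sessions) to take effect.
    -- Sketch only: turn on READ_COMMITTED_SNAPSHOT for a hypothetical database.
    ALTER DATABASE PayrollDB SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;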
    "I discovered that running in SSMS and Crystal Reports generates two different query plans even though I copied the SQL from SQL Profiler."
    This is because SSMS by default runs with SET ARITHABORT ON. I discuss this in more detail in this article on my web site:
    http://www.sommarskog.se/query-plan-mysteries.html
    However, as I said, this problem is not related to the query plan as such, although some query plans are more susceptible to this error than others. (All plans are susceptible to producing incorrect results.)
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Data Mover login error (SQL 2008R2 as backend DB)

    Hi,
    This question was asked in this thread -> Continued Discussion - Steps to Upgrade PT8.49 to 8.50 Manually
    The difference between the problem in that thread and mine is that I have WIN08R2, SQL08R2, PeopleTools 9.52 and PeopleSoft HRMS 9.0 installed. I was on the step of creating the databases manually with the SQL scripts, and that was also performed.
    I also have a different ConnectID: people and AccessID: peoplesa, with respective passwords. I try logging in with the AccessID and I get the error below.
    File: SQL Access ManagerSQL error. Stmt #: 2 Error Position: 0 Return: 8077 - Invalid SNAC client version (minimum is 10.00.1763) (SQLSTATE PS077)
    Any suggestions?
    -Vishal

    Hi there,
    I upgraded the SNAC client to 1033\x64\sqlncli.msi from Microsoft and I still get the same login error in Data Mover.
    I checked the registry location --> HKEY_Local_Machine\Software\Microsoft\Microsoft SQL Server Native Client 10.0\Current Version = 10.51.2500.0
    The Data Mover login error says that it requires a minimum of 10.00.1763, so I guess the one I have should work fine?
    Any suggestions?
    -Vishal

  • Reflection errors in Field Data Edit Scripting context(Line Item class)

    Hello Experts,
    I have a script that does some validation in the Field Data Edit Scripting context of the "Line Item" class, and I have "MATERIAL" as my target. When I try throwing an Application Exception in this context I get a reflection error message box instead of the message I passed to the Application Exception constructor.
    your help will be greatly appreciated.
    kind regards,

    One thing to be aware of is that no matter how you choose to construct your exception in field, field data edit, and collection scripts, the attribute is always set to be the target field/collection. Have you noticed that? The script designers made things that way. What's going on here is that the exception raised in the interpreter is caught by the Script Manager and rethrown with the script target as the attribute and your raised exception as the exception.
    One other thing I would point out is that scripts set to execute on the Collection Member Lifecycle event tend to be poor performers. You can get a faster result if you edit the whole collection and chain the errors onto one ChainException. I can only speculate as to why, but I have seen major improvements in complex scripts when I iterate the whole collection rather than implementing a collection member lifecycle Validate event. This is counter-intuitive, but there it is.
    Finally, exceptions raised in Collection Lifecycle events that interrupt the overall save process in the presence of parent document Lifecycle Validation events can result in partially saved data. I observed this issue a few years back and it may be resolved now. The only member lifecycle event I use is Created, to lock, default, etc.
    So, for your particular problem, you may want to rethink your strategy and see if you can get things to work while bypassing that reflection issue. If you still can't raise the exception on MATERIAL, maybe you can raise it on another field; another advantage of this approach is that you have full control to raise any error on any field on the Line Items.

  • Data adapter failed during OnLoad: The UDC file contains errors: '=' is an unexpected token. The expected token is ';'.

    I am working on a SharePoint 2010 InfoPath form. I have created a secondary connection to a SharePoint list view using an XML connection, and it works fine.
    For deployment from DEV to PROD I converted this connection to a universal data connection (UDC) and tested it before deploying to PROD, but I am getting the error below.
    An error occurred querying a data source.
    I checked the ULS logs and found the details below:
    Data adapter failed during OnLoad: The UDC file contains errors: '=' is an unexpected token. The expected token is ';'. Line 17, position 108.
    The UDC file is below:
    <?xml version="1.0" encoding="UTF-8"?>
    <?MicrosoftWindowsSharePointServices ContentTypeID="0x010100B4CBD48E029A4ad8B62CB0E41868F2B0"?>
    <udc:DataSource MajorVersion="2" MinorVersion="0" xmlns:udc="http://schemas.microsoft.com/office/infopath/2006/udc">
    <udc:Name>LastItemID</udc:Name>
    <udc:Description>Format: UDC V2; Connection Type: XmlQuery; Purpose: ReadOnly; Generated by Microsoft InfoPath 2010 on 2014-04-25 at 16:49:31 by DOMAIN\username.</udc:Description>
    <udc:Type MajorVersion="2" MinorVersion="0" Type="XmlQuery">
    <udc:SubType MajorVersion="0" MinorVersion="0" Type=""/>
    </udc:Type>
    <udc:ConnectionInfo Purpose="ReadOnly" AltDataSource="">
    <udc:WsdlUrl/>
    <udc:SelectCommand>
    <udc:ListId/>
    <udc:WebUrl/>
    <udc:ConnectionString/>
    <udc:ServiceUrl UseFormsServiceProxy="false"/>
    <udc:SoapAction/>
    <udc:Query>https://contoso/_vti_bin/owssvr.dll?Cmd=Display&List={32364DED-7FE3-4276-837C-F2AC62C04B81}&View={804CC528-34B2-4473-89DB-C4E766CACC95}&XMLDATA=TRUE&NOREDIRECT=TRUE</udc:Query>
    </udc:SelectCommand>
    <udc:UpdateCommand>
    <udc:ServiceUrl UseFormsServiceProxy="false"/>
    <udc:SoapAction/>
    <udc:Submit/>
    <udc:FileName>Specify a filename or formula</udc:FileName>
    <udc:FolderName AllowOverwrite=""/>
    </udc:UpdateCommand>
    <!--udc:Authentication><udc:SSO AppId='' CredentialType='' /></udc:Authentication-->
    </udc:ConnectionInfo>
    </udc:DataSource>
    w: sandippatilblog.blogspot.com/

    Hi Sandip,
    How about escaping "&" as "&amp;", as below:
    <udc:Query>https://contoso/_vti_bin/owssvr.dll?Cmd=Display&amp;List={32364DED-7FE3-4276-837C-F2AC62C04B81}&amp;View={804CC528-34B2-4473-89DB-C4E766CACC95}&amp;XMLDATA=TRUE&amp;NOREDIRECT=TRUE</udc:Query>
    Reference:
    http://social.technet.microsoft.com/Forums/en-US/534fae6b-2cef-4947-86e2-4869cb291cfe/the-form-cannot-be-opened-because-it-requires-the-domain-permission-level-log-id-5566-error?forum=sharepointcustomizationlegacy 
    http://stackoverflow.com/questions/3493405/do-i-really-need-to-encode-as-amp
    Best Regards,
    Eric
    Eric Tao
    TechNet Community Support

  • I have an iMac 27" and am trying to import some videos of a friend's wedding into iMovie; however, one of the movies won't import. It doesn't say why or give any error message or codes. All of the other movies on the card download without a problem.

    I have an iMac 27" and am trying to import some videos of a friend's wedding into iMovie; however, one of the movies won't import. It doesn't say why or give any error message or codes.
    All of the other movies on the card download without a problem. The movie in question is not 'corrupt', as you can watch it in iMovie directly from the SD card, but as soon as you try to import it, it just says 'error'. I've tried moving the file to an external drive (and other variations on this theme) and then importing, but have had no luck.
    Can anyone please help me.

    The mystery remains....
    Thanks for the pointers. The file type is .mts (a proprietary Sony one).
    I have now found some video converter software (Wondershare and iSkysoft), at a cost. Either will convert this file into .mp4, which I can then import into iMovie without any problems. I've checked this on the trial versions and it worked well, but without paying I am left with a giant watermark in the video.
    The mystery (which I still haven't solved) is why did 20 other .mts files import fine and then this one not?
    If you could point me in the direction of some free .mts converter software, that would be the cherry on the cake.
    Thanks

  • SQLDeveloper tool script execution aborts with Error report: No more data to read from socket

    Hello,
    Strange behaviour of the SQL Developer tool while executing a script with typical DDL like:
    Create Table,
    Alter Table,
    Create Trigger (use of :new and :old attributes in the body of the trigger),
    Insert Into...
    The script works OK from time to time,
    but sometimes it aborts with the error:
    Error report:
    No more data to read from socket
    Do not understand where is the problem.
    The script works OK when executed in SQL*Plus on the server (where the Oracle RDBMS resides).
    The version of SQLDeveloper is
    Version 3.2.20.09 Build MAIN-09.87
    The version of the RDBMS is 11.2.0.2.0.
    Thanks for any reference, direction, hint for an upgrade, or shared experience.
    Greetings,

    Welcome to the forum!
    Please provide the 4-digit Oracle version (result of SELECT * FROM V$VERSION) for the source and target servers; 10g and 11g are not versions. You also mention SQL Developer, so what is the exact version you are using?
    >
    If I delete the "OF out_a.ALLOCATION_ID" from the FOR UPDATE clause of CURSOR exist_allocation, this problem does not happen, and the code compiles successfully in SQL Developer for Oracle 10g.
    >
    Please clarify what works and what doesn't work, because your statements are both incorrect and misleading.
    You can't delete the "OF out_a.ALLOCATION_ID" from the FOR UPDATE clause, or you would get a syntax error by leaving FOR UPDATE OF with nothing specified after it.
    Also, your original statement said
    >
    But when I compile a package which is fine on Oracle 10g
    >
    But now you say that if you delete the "OF..." the problem doesn't happen and the code compiles on 10g.
    Does the original code compile on 10g or not? Does it compile on 11g or not? After the original code is migrated to 11g, does it compile? That is, now that the code is there, can you manually compile it?

  • Data movement of Chart of Accounts from one instance to another

    We are doing a data move of the Chart of Accounts tables (FND_FLEX tables) from one instance (Oracle 10.7) to another instance (Oracle 11.0.3).
    Some of the columns of the flex value tables are obsolete in Oracle 11.0.3; for example, the description in fnd_flex_values is obsolete and the description is taken from fnd_flex_values_tl instead.
    My question is: is it necessary to move the data of the fnd_flex_values_tl table to get the description, or will moving just the fnd_flex_values table serve the purpose?
    Thanks in advance.

    Hi,
    When upgrading you need to populate both the fnd_flex_values and the fnd_flex_values_tl tables. Please note that applications will not function properly if you don't populate this table.
    I am not sure how you are populating this table; if you are using the Oracle upgrade process, then Oracle takes care of this, otherwise please ensure that you populate all the _tl tables.
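    As a sanity check after the move, something like the sketch below could flag values that have no _tl row; it assumes the standard FLEX_VALUE_ID key shared by the two tables.
    -- Sketch only: list flex values that are missing a corresponding _tl row.
    SELECT v.flex_value_id, v.flex_value
    FROM   fnd_flex_values v
    WHERE  NOT EXISTS (SELECT 1
                       FROM   fnd_flex_values_tl t
                       WHERE  t.flex_value_id = v.flex_value_id);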
    Thanks
    Bharat

  • Data error during Netweaver 2004s installation

    Hi,
    I am trying to install SAP NetWeaver 2004s on the Windows x64 platform.
    I installed J2SE 1.4.2_11-x86. At the installation step "Import Java Dump" of NetWeaver 2004s I get the error message:
    ++++
    ERROR 2006-04-20 14:26:15
    CJS-30049  Execution of JLoad tool 'C:\j2sdk1.4.2_11-x64\bin\java.exe -classpath C:\PROGRA1\SAPINS1\SOLMAN\SYSTEM\MSS\CENTRAL\AS\install\sharedlib\launcher.jar -showversion -Xmx512m com.sap.engine.offline.OfflineToolStart com.sap.inst.jload.Jload C:/PROGRA1/SAPINS1/SOLMAN/SYSTEM/MSS/CENTRAL/AS/install/lib/iaik_jce.jar;C:/PROGRA1/SAPINS1/SOLMAN/SYSTEM/MSS/CENTRAL/AS/install/sharedlib/jload.jar;C:/PROGRA1/SAPINS1/SOLMAN/SYSTEM/MSS/CENTRAL/AS/install/sharedlib/antlr.jar;C:/PROGRA1/SAPINS1/SOLMAN/SYSTEM/MSS/CENTRAL/AS/install/sharedlib/exception.jar;C:/PROGRA1/SAPINS1/SOLMAN/SYSTEM/MSS/CENTRAL/AS/install/sharedlib/jddi.jar;C:/PROGRA1/SAPINS1/SOLMAN/SYSTEM/MSS/CENTRAL/AS/install/sharedlib/logging.jar;C:/PROGRA1/SAPINS1/SOLMAN/SYSTEM/MSS/CENTRAL/AS/install/sharedlib/offlineconfiguration.jar;C:/PROGRA1/SAPINS1/SOLMAN/SYSTEM/MSS/CENTRAL/AS/install/sharedlib/opensqlsta.jar;C:/PROGRA1/SAPINS1/SOLMAN/SYSTEM/MSS/CENTRAL/AS/install/sharedlib/tc_sec_secstorefs.jar;D:/usr/sap/S01/SYS/exe/uc/NTAMD64/mssjdbc/base.jar;D:/usr/sap/S01/SYS/exe/uc/NTAMD64/mssjdbc/util.jar;D:/usr/sap/S01/SYS/exe/uc/NTAMD64/mssjdbc/sqlserver.jar;D:/usr/sap/S01/SYS/exe/uc/NTAMD64/mssjdbc/spy.jar -sec S01,jdbc/pool/S01,
    yar15/sapmnt/S01/SYS/global/security/data/SecStore.properties,
    yar15/sapmnt/S01/SYS/global/security/data/SecStore.key -dataDir E:/JAVA_EXPORT\JDMP -job C:\PROGRA1\SAPINS1\SOLMAN\SYSTEM\MSS\CENTRAL\AS\IMPORT.XML -log jload.log' aborts with return code 1.<br>SOLUTION: Check 'jload.log' and 'C:/PROGRA1/SAPINS1/SOLMAN/SYSTEM/MSS/CENTRAL/AS/jload.java.log' for more information.
    ++++
    In the jload.log file the following error is displayed:
    +++++
    Apr 20, 2006 2:22:28 PM com.sap.inst.jload.Jload logStackTrace
    SEVERE: java.io.IOException: Data error (cyclic redundancy check)
    at java.io.FileInputStream.readBytes(Native Method)
    at java.io.FileInputStream.read(FileInputStream.java:177)
    at com.sap.inst.jload.io.SplitInputStream.fillBuffer(SplitInputStream.java:119)
    at com.sap.inst.jload.io.SplitInputStream.read(SplitInputStream.java:327)
    at com.sap.inst.jload.io.GZIPInputSentinel.fillBuffer(GZIPInputSentinel.java:93)
    at com.sap.inst.jload.io.GZIPInputSentinel.read(GZIPInputSentinel.java:160)
    at java.util.zip.InflaterInputStream.fill(InflaterInputStream.java:213)
    at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:134)
    at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:87)
    at com.sap.inst.jload.io.ChoppedInputStream.read(ChoppedInputStream.java:130)
    at java.io.BufferedInputStream.read1(BufferedInputStream.java:220)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:280)
    at java.io.DataInputStream.read(DataInputStream.java:170)
    at com.sap.inst.jload.db.BlobHandler.setValue(BitHandler.java:147)
    at com.sap.inst.jload.db.DBTable.load(DBTable.java:274)
    at com.sap.inst.jload.Jload.dbImport(Jload.java:323)
    at com.sap.inst.jload.Jload.executeJob(Jload.java:397)
    at com.sap.inst.jload.Jload.main(Jload.java:621)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:324)
    at com.sap.engine.offline.OfflineToolStart.main(OfflineToolStart.java:81)
    Apr 20, 2006 2:22:30 PM com.sap.inst.jload.db.DBConnection disconnect
    INFO: disconnected
    +++++
    Does anyone know what is causing this problem?
    Best regards,
    Twan Janssen

    Hi Twan
    Where did you get the j2sdk1.4.2_11 for x64? I'm now starting an NW04s installation on x64, and SAP note 941595 presents a link to the Sun web site, but the link is not correct and results in an error. So I cannot go on with the installation, since I don't have the correct J2SDK.
    Thanks a lot in advance.
    Kind regards,
    Sam

  • ERROR: Error in getting session data: Invalid or Expired Session

    Hi,
    We are using SSRS 2008 R2 and have a report which has links that allow the user to drill down.
    When the link is clicked, the same report is run but shows more detail. This works fine, but we are now getting the error below for one user.
    I have increased the timeout for the report to 600 seconds.
    Error in getting session data: Invalid or Expired Session: fnxgxhjuveugvd552qfyv3fw
    session!ReportServer_0-6!4b0!03/18/2015-10:51:29:: i INFO: LoadSnapshot: Item with session: fnxgxhjuveugvd552qfyv3fw, reportPath: , userName: doman/user not found in the database
    Any ideas how this can be resolved?

    Hi Nasa1999,
    Based on your error message: if all users sometimes get the same issue, it can be caused by a timeout. This happens a lot when rendering a big report that exceeds the default session timeout. You can execute a script to increase the session timeout; please refer to the following blog:
    Session Timeout during execution. Also increase the value of <Query Timeout> in the rsreportserver.config file. This file is located at XX:\Program Files\Microsoft SQL Server\MSRS10.MSSQLSERVER\Reporting Services\ReportServer
    Please also check whether <legacyImpersonationPolicy> should be set back to true (https://support.microsoft.com/en-us/kb/972328):
    Locate the Aspnet.config file in the following folder:
        %windir%\Microsoft.NET\Framework64\v2.0.50727
    If Microsoft Visual Studio is installed on the computer that is running SQL Server 2005, double-click Aspnet.config. If Visual Studio is not installed on the computer, follow these steps:
        a. Right-click Aspnet.config, point to Open with, and then click Choose program.
        b. In the Programs list, click Notepad, and then click OK.
    Locate the following tag in the <runtime> section of the code:
        <legacyImpersonationPolicy enabled="false"/>
    Change the tag to:
        <legacyImpersonationPolicy enabled="true"/>
    Save the Aspnet.config file.
    Reset IIS.
    Try your report again.
    A similar thread for your reference:
    Using SharePoint Report Viewer Web Part: Error in getting session data: Invalid or Expired Session: xxxxxxxxxxx
    If only one user or a few users get the issue, it can be caused by the permission settings; please refer to the blogs below:
    http://answers.flyppdevportal.com/categories/sqlserver/sqlreportingservices.aspx?ID=3506231b-9f4d-4f5a-884d-157137c56336
    http://blog.goobol.com/the-permissions-granted-to-user-domainusername-are-insufficient-for-performing-this-operation/
    If your problem still exists, please try to provide more detailed information as below:
    Is your Reporting Services instance installed in native mode or SharePoint mode? If in native mode, please try to provide more error messages from the log files in:
    C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\LogFiles
    If you still have any problem, please feel free to ask.
    Regards,
    Vicky Liu
    TechNet Community Support

  • 8008 Error when trying to download movies from iTunes

    I've searched Google and the discussion boards, but no one seems to have a fix for the 8008 error when trying to download movies in iTunes. I have the latest version of iTunes, my Mac mini is up to date with all of its patches, and I've tried deleting the movie in the download folder and redownloading it, but I still get nothing. Does anyone have any suggestions for a fix?
    Thanks,

    Morning pettis360,
    Thanks for using Apple Support Communities.
    For troubleshooting on this, take a look at this article:
    iTunes: Advanced iTunes Store troubleshooting
    http://support.apple.com/kb/TS3297
    "Error -50," "-5000," "8003," "8008," or "-42023"
    These alerts occur due to timeouts or conflicts trying to write a file during download.
    If you encounter this issue while downloading something from the iTunes Store:
    Delete your iTunes Downloads folder, located in:
    Mac OS X:
    ~/Music/iTunes/iTunes Media/Downloads
    Note: "iTunes Media" may appear as "iTunes Music". Also, the tilde (~) refers to your Home directory.
    After locating your iTunes Downloads folder:
    Quit iTunes.
    Delete the Downloads folder on your computer.
    Open iTunes.
    Choose Store > Check for Available Downloads.
    Enter your account name and password.
    Hope this helps,
    Mario
