Data Integrator Error ASSIGN ... CASTING

I am consistently getting an ASSIGN ... CASTING error on the second R/3 data flow that runs in any of my SAP jobs. The first R/3 data flow runs correctly. I am using Data Integrator version 11.0.2.5. Has anyone experienced this error when attempting to run R/3 data flows?

Greetings Post Originator,
This post is older than 60 days and there are no entries in the past 30 days. Based on the content discussed, it appears that your question has been answered. This message is being marked as answered, and points are being assigned where possible.
Thank you for being an active participant in the SAP Forums,
Rob Siegele
Forum Moderator
SAP Americas

Similar Messages

  • Data Integrator Web administrator & Login Failed in Profiler server login

    Post Author: mohideen_km
    CA Forum: Data Integration
    Hi guys,
    Please help me with two problems in BusinessObjects Data Integrator.
    1. Login failed at the Profiler Server Login: I created a profiler repository in the Repository Manager, but when I log in to the BO DI Designer and then try to log in to the Profiler Server, I get an error that the connection to the Profiler server failed, and I have to close the application without saving the login.
    2. Web Administrator problem: I cannot view the batch job history for the repository. The procedure I followed in the Data Integrator Web Administrator was: log in as Admin, then in the management tree in the left frame of the browser click Status Interval, select "Show last execution of job", click Apply, then click Home. After clicking Home I click the repository I created (direpo) under Batch, but after that I cannot see anything, not even a job name.
    Help me to figure it out.
    Mohideen
    Prediktiva Technologies

    Post Author: bhofmans
    CA Forum: Data Integration
    Hi,
    For the profiler problem: you need to add the profiler repository to the web admin too (you can do this in the tree Management/Repositories). Next you need a web admin user that is linked to this repo; you could use the admin user for that, just check in Management/Users that the profiler repo is attached to this user.
    Once this is set up in the management console, you can connect to the Profiling services from within the Designer, using the web administrator's server name and port (e.g. localhost:28080) and the web admin user and password (e.g. admin/admin).
    For your second problem: please check the connection to the repo (I assume you already added the direpo to the web admin). Again in the tree Management/Repositories, go to the local repository connection you created and check the connection (there is a Test button). If the connection is correct, you will see the job history (empty if there are no job executions); on the configuration page you will see a list of all jobs and can execute or schedule them.
    Hope this helps...

  • Finding Error while creating Data Integrator Repository

    Hi,
    I am working with SAP BO Data Integrator. I have created the databases and their logins and installed SQL Server, and while installing Data Integrator I got the following error when creating a new repository:
    Cannot open connection to the repository.  The error message from the underlying DBMS is <ODBC call <SQLDriverConnect> for data source <MSSERVER\SQLEXPRESS> failed: <[Microsoft][ODBC SQL Server Driver][SQL Server]Login failed for user 'AIO_REPO_IDOC'. The user is not associated with a trusted SQL Server connection.>. Notify Customer Support.>. (BODI-20006)
    Can anyone help me resolve this problem?
    Edited by: sap_beginnner on Aug 9, 2010 4:10 PM

    Hi,
    I used SQL Server Authentication to log on to the databases, and the DI version I am using is SAP BusinessObjects XI 3.2.
    I tried again by deleting all the databases and then recreating them, but creating the repository for AIO_REPO_IDOC in the Repository Manager gives the following error:
    Cannot open connection to the repository.  The error message from the underlying DBMS is <ODBC call <SQLDriverConnect> for data source <IDHASOFT238933\SQLEXPRESS> failed: <[Microsoft][ODBC SQL Server Driver][SQL Server]Login failed for user 'AIO_REPO_IDOC'. The user is not associated with a trusted SQL Server connection.>. Notify Customer Support.>. (BODI-20006) 
    An error occurred during creation of the local repository. (BODI-300054)
    I also tried accessing the databases through another tool, and that works successfully.
    Edited by: sap_beginnner on Aug 10, 2010 8:58 AM
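    For reference, the message "The user is not associated with a trusted SQL Server connection" normally means that a SQL Server login (here AIO_REPO_IDOC) is being used against an instance that only accepts Windows (trusted) authentication, i.e. mixed-mode authentication is not enabled on that SQL Server instance. The Repository Manager itself connects through ODBC, but the following minimal JDBC sketch (assuming the Microsoft JDBC driver is on the classpath; server name, database and password are placeholders) illustrates the difference between the two connection modes:
    [CODE]
    import java.sql.Connection;
    import java.sql.DriverManager;

    public class SqlServerAuthModes {
        public static void main(String[] args) throws Exception {
            // SQL Server authentication: needs the instance to run in mixed mode.
            // Against a Windows-authentication-only instance this fails with
            // "Login failed for user ... not associated with a trusted SQL Server connection."
            String sqlAuthUrl = "jdbc:sqlserver://MSSERVER\\SQLEXPRESS;databaseName=AIO_REPO_IDOC;"
                    + "user=AIO_REPO_IDOC;password=<password>;encrypt=false";

            // Windows (trusted) authentication: uses the logged-on Windows account
            // instead of a SQL login (requires the driver's native auth library).
            String trustedUrl = "jdbc:sqlserver://MSSERVER\\SQLEXPRESS;databaseName=AIO_REPO_IDOC;"
                    + "integratedSecurity=true;encrypt=false";

            try (Connection c = DriverManager.getConnection(sqlAuthUrl)) {
                System.out.println("Connected with SQL Server authentication");
            }
        }
    }
    [/CODE]
    So either the SQL Server instance needs to allow SQL Server authentication (mixed mode), or the repository connection needs to use a Windows account; which of the two applies here depends on how the instance was installed.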

  • Web template: Error Valid Data provider not assigned

    Hello friends, we have a web template that was working before we upgraded from SP11 to SP13.
    It now gives the error "Valid Data provider not assigned" and
    "Bad integer Parameter value of Parameter BLOCK_COLUMNS_SIZE of Object item:ANALYSIS_ITEM:ANALYSIS;"
    Please help: where do I need to assign the data provider? The values for the data provider are the same as before.
         Portal Runtime Error
    An exception occurred while processing your request
    Exception id:
    See the details for the exception ID in the log file.
    Thanks
    Soniya
    Message was edited by:
            soniya kapoor

    Open the template in WAD and do a validation against the server and make sure the template is well formed.
    Also make sure you apply BI JAVA Patch 1 for SPS 13.

  • Error loading data from Oracle to flat file in Oracle Data Integrator

    Hi All,
    I am trying to load data from an Oracle database to a flat file, but I am getting an error at the "Insert new row" (integration) step.
    Error Message is:
    7000 : null : com.sunopsis.jdbc.driver.file.b.i
    com.sunopsis.jdbc.driver.file.b.i
         at com.sunopsis.jdbc.driver.file.b.d.setDate(d.java)
         at com.sunopsis.sql.SnpsQuery.updateExecStatement(SnpsQuery.java)
         at com.sunopsis.sql.SnpsQuery.addBatch(SnpsQuery.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    I am not able to find the exact reason. Can anyone help me with this?

    Yes, I found the mistake I was making. The target file data types should be string, but I had defined a different type for each column.
    When I changed the data types to string, the data loaded into the target.
    Thanks for your reply.

  • Data Integrator - Bulk Load Append ERROR

    Hi,
    I am working with a Sybase database and I have the following problem:
    My source table has, for example, 1000 rows, and I set my target table to use the bulk load append option with a commit every 100 rows. But every time the process ends, my target has only 900 rows; I think Data Integrator does not commit the last 100 rows.
    It is a "clean" pass of the data: no WHERE clause, no joins, just a simple copy from one table to another.
    Can anyone give me a solution to this problem?
    Thanks a lot...

    Sybase ASE or IQ, and which version?
    What Data Integrator version are you using?
    Is the job failing with errors? The last batch may not be committed because of constraints on the target table. Did you try inserting the same set of rows without bulk load? Try inserting the rows without bulk load and see if it gives any errors.

  • Referential Integrity Error when changing Entity Name

    I updated the metadata file to change an Entity name, and when I tried to load it I received an integrity error because of some old journal entries from 2001-2003. Is there a way around this without modifying the journal entries?
    Metadata referential integrity check started at 14:46:19
    Journals::ACT_APR_TAXADJ     Scenario::ACTUAL     Year::2003     Value::<Parent Curr Adjs>     Period::April          has 2 records of Changed::[ ENTITY::CAMPINAS ]
    Journals::ACT_FEB_TAXADJ     Scenario::ACTUAL     Year::2003     Value::<Parent Curr Adjs>     Period::February          has 2 records of Changed::[ ENTITY::CAMPINAS ]
    Journals::ACT_JAN_TAXADJ     Scenario::ACTUAL     Year::2003     Value::<Parent Curr Adjs>     Period::January          has 2 records of Changed::[ ENTITY::CAMPINAS ]
    Journals::ACT_JUL_TAXADJ     Scenario::ACTUAL     Year::2003     Value::<Parent Curr Adjs>     Period::July          has 2 records of Changed::[ ENTITY::CAMPINAS ]
    Journals::ACT_JUN_TAXADJ     Scenario::ACTUAL     Year::2003     Value::<Parent Curr Adjs>     Period::June          has 8 records of Changed::[ ENTITY::CAMPINAS ]
    Journals::ACT_MAR_TAXADJ     Scenario::ACTUAL     Year::2003     Value::<Parent Curr Adjs>     Period::March          has 8 records of Changed::[ ENTITY::CAMPINAS ]
    Journals::ACT_MAY_TAXADJ     Scenario::ACTUAL     Year::2003     Value::<Parent Curr Adjs>     Period::May          has 2 records of Changed::[ ENTITY::CAMPINAS ]
    Journals::ACTUAL2001APR282     Scenario::ActPara     Year::2001     Value::<Parent Curr Adjs>     Period::April          has 2 records of Changed::[ ENTITY::CAMPINAS ]
    Journals::ACTUAL2001DEC282     Scenario::ActPara     Year::2001     Value::<Parent Curr Adjs>     Period::December          has 5 records of Changed::[ ENTITY::CAMPINAS ]
    Journals::ACTUAL2001FEB282     Scenario::ActPara     Year::2001     Value::<Parent Curr Adjs>     Period::February          has 2 records of Changed::[ ENTITY::CAMPINAS ]
    Journals::ACTUAL2001MAR282     Scenario::ActPara     Year::2001     Value::<Parent Curr Adjs>     Period::March          has 2 records of Changed::[ ENTITY::CAMPINAS ]

    As long as you are only changing the name of a parent, you will only be affected by Parent currency adjustment journals that were made to that parent/entity combination. The workaround of unposting the journals, changing the parent, and reposting the journals will solve the problem for now. You are correct in assuming this is not best practice.
    Part of the beauty of HFM is having many organizational structures (hierarchies) and having journals follow the entities when you move them or (rarely) change their names. If you assign a Parent currency journal to a specific parent/entity combination and you have organizational changes down the road, you will run into this again. We safeguard against this by having a default hierarchy that contains all base entities, one for each currency group. All our parents are USD, so we only require one group, USD_Journals. Use this as the parent for all Parent currency journal entries. Then it does not matter which hierarchy the entity shows up in; the journal will follow, and the USD_Journals/entity combination will never go away.
    This solution is for parent entities only. Base entities should never change names or default currencies; that creates the mess I was describing earlier and a dangerous risk of losing historical data.
    Edited by: GCAPO on Mar 18, 2013 7:08 AM

  • OSB: Cannot acquire data source error while using JCA DBAdapter in OSB

    Hi All,
    I've run into a 'Cannot acquire data source' error while using the JCA DBAdapter in OSB.
    The error info is as follows:
    The invocation resulted in an error: Invoke JCA outbound service failed with application error, exception: com.bea.wli.sb.transports.jca.JCATransportException: oracle.tip.adapter.sa.api.JCABindingException: oracle.tip.adapter.sa.impl.fw.ext.org.collaxa.thirdparty.apache.wsif.WSIFException: servicebus:/WSDL/DBAdapter1/RetrievePersonService [ RetrievePersonService_ptt::RetrievePersonServiceSelect(RetrievePersonServiceSelect_inputParameters,PersonTCollection) ] - WSIF JCA Execute of operation 'RetrievePersonServiceSelect' failed due to: Could not create/access the TopLink Session.
    This session is used to connect to the datastore.
    Caused by Exception [EclipseLink-7060] (Eclipse Persistence Services - 2.0.2.v20100323-r6872): org.eclipse.persistence.exceptions.ValidationException
    Exception Description: Cannot acquire data source [jdbc/soademoDatabase].
    Internal Exception: javax.naming.NameNotFoundException: Unable to resolve 'jdbc.soademoDatabase'. Resolved 'jdbc'; remaining name 'soademoDatabase'.
    ; nested exception is:
    BINDING.JCA-11622
    Could not create/access the TopLink Session.
    This session is used to connect to the datastore.
    JNDI Name for the Database pool: eis/DB/soademoDatabase
    JNDI Name for the Data source: jdbc/soademoDatabase
    I created a basic DBAdapter in JDeveloper, took the xsd file, wsdl file, .jca file and the TopLink mapping file, and imported them into the OSB project.
    Then I used the .jca file to generate a business service; when I tested it, the error above occurred.
    Login info in RetrievePersonService-or-mappings.xml
    <login xsi:type="database-login">
    <platform-class>org.eclipse.persistence.platform.database.oracle.Oracle9Platform</platform-class>
    <user-name></user-name>
    <connection-url></connection-url>
    </login>
    The .jca file content is as follows:
    <adapter-config name="RetrievePersonService" adapter="Database Adapter" wsdlLocation="RetrievePersonService.wsdl" xmlns="http://platform.integration.oracle/blocks/adapter/fw/metadata">
    <connection-factory location="eis/DB/soademoDatabase" UIConnectionName="Connection1" adapterRef=""/>
    <endpoint-interaction portType="RetrievePersonService_ptt" operation="RetrievePersonServiceSelect">
    <interaction-spec className="oracle.tip.adapter.db.DBReadInteractionSpec">
    <property name="DescriptorName" value="RetrievePersonService.PersonT"/>
    <property name="QueryName" value="RetrievePersonServiceSelect"/>
    <property name="MappingsMetaDataURL" value="RetrievePersonService-or-mappings.xml"/>
    <property name="ReturnSingleResultSet" value="false"/>
    <property name="GetActiveUnitOfWork" value="false"/>
    </interaction-spec>
    </endpoint-interaction>
    </adapter-config>
    RetrievePersonService_db.wsdl is as follows:
    <?xml version="1.0" encoding="UTF-8"?>
    <WL5G3N0:definitions name="RetrievePersonService-concrete" targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/db/KnowledeMgmtSOAApplication/AdapterJDevProject/RetrievePersonService" xmlns:WL5G3N0="http://schemas.xmlsoap.org/wsdl/" xmlns:WL5G3N1="http://xmlns.oracle.com/pcbpel/adapter/db/KnowledeMgmtSOAApplication/AdapterJDevProject/RetrievePersonService" xmlns:WL5G3N2="http://schemas.xmlsoap.org/wsdl/soap/">
    <WL5G3N0:import location="RetrievePersonService.wsdl" namespace="http://xmlns.oracle.com/pcbpel/adapter/db/KnowledeMgmtSOAApplication/AdapterJDevProject/RetrievePersonService"/>
    <WL5G3N0:binding name="RetrievePersonService_ptt-binding" type="WL5G3N1:RetrievePersonService_ptt">
    <WL5G3N2:binding style="document" transport="http://www.bea.com/transport/2007/05/jca"/>
    <WL5G3N0:operation name="RetrievePersonServiceSelect">
    <WL5G3N2:operation soapAction="RetrievePersonServiceSelect"/>
    <WL5G3N0:input>
    <WL5G3N2:body use="literal"/>
    </WL5G3N0:input>
    <WL5G3N0:output>
    <WL5G3N2:body use="literal"/>
    </WL5G3N0:output>
    </WL5G3N0:operation>
    </WL5G3N0:binding>
    <WL5G3N0:service name="RetrievePersonService_ptt-bindingQSService">
    <WL5G3N0:port binding="WL5G3N1:RetrievePersonService_ptt-binding" name="RetrievePersonService_ptt-bindingQSPort">
    <WL5G3N2:address location="jca://eis/DB/soademoDatabase"/>
    </WL5G3N0:port>
    </WL5G3N0:service>
    </WL5G3N0:definitions>
    Any suggestion is appreciated.
    Thanks in advance!
    Edited by: user11262117 on Jan 26, 2011 5:28 PM

    Hi Anuj,
    Thanks for your reply!
    I found that the data source is registered on server soa_server1 as follows:
    Binding Name: jdbc.soademoDatabase
    Class: weblogic.jdbc.common.internal.RmiDataSource_1033_WLStub
    Hash Code: 80328036
    toString Results: ClusterableRemoteRef(8348400613458600489S:10.2.1.143:[8001,8001,-1,-1,-1,-1,-1]:base_domain:soa_server1 [8348400613458600489S:10.2.1.143:[8001,8001,-1,-1,-1,-1,-1]:base_domain:soa_server1/291])/291
    Binding Name: jdbc.SOADataSource
    Class: weblogic.jdbc.common.internal.RmiDataSource_1033_WLStub
    Hash Code: 92966755
    toString Results: ClusterableRemoteRef(8348400613458600489S:10.2.1.143:[8001,8001,-1,-1,-1,-1,-1]:base_domain:soa_server1 [8348400613458600489S:10.2.1.143:[8001,8001,-1,-1,-1,-1,-1]:base_domain:soa_server1/285])/285
    I don't know how to determine which server the DBAdapter is targeted to.
    But I found the following information:
    Under Deployment -> DBAdapter -> Monitoring -> Outbound Connection Pools:
    Outbound Connection Pool    Server        State     Current Connections    Created Connections
    eis/DB/SOADemo              AdminServer   Running   1                      1
    eis/DB/SOADemo              soa_server1   Running   1                      1
    eis/DB/soademoDatabase      AdminServer   Running   1                      1
    eis/DB/soademoDatabase      soa_server1   Running   1                      1
    The DbAdapter is related to the following files:
    C:\Oracle\Middleware\home_11gR1\Oracle_SOA1\soa\connectors\DbAdapter.rar
    C:\Oracle\Middleware\home_11gR1\Oracle_SOA1\soa\DBPlan\Plan.xml
    I unzipped DbAdapter.rar, opened weblogic-ra.xml and found that only one data source is registered:
    <?xml version="1.0"?>
    <weblogic-connector xmlns="http://www.bea.com/ns/weblogic/90">
    <enable-global-access-to-classes>true</enable-global-access-to-classes>
    <outbound-resource-adapter>
    <default-connection-properties>
    <pool-params>
    <initial-capacity>1</initial-capacity>
    <max-capacity>1000</max-capacity>
    </pool-params>
    <properties>
    <property>
    <name>usesNativeSequencing</name>
    <value>true</value>
    </property>
    <property>
    <name>sequencePreallocationSize</name>
    <value>50</value>
    </property>
    <property>
    <name>defaultNChar</name>
    <value>false</value>
    </property>
    <property>
    <name>usesBatchWriting</name>
    <value>true</value>
    </property>
    <property>
    <name>usesSkipLocking</name>
    <value>true</value>
    </property>
    </properties>
              </default-connection-properties>
    <connection-definition-group>
    <connection-factory-interface>javax.resource.cci.ConnectionFactory</connection-factory-interface>
    <connection-instance>
    <jndi-name>eis/DB/SOADemo</jndi-name>
              <connection-properties>
                   <properties>
                   <property>
                   <name>xADataSourceName</name>
                   <value>jdbc/SOADataSource</value>
                   </property>
                   <property>
                   <name>dataSourceName</name>
                   <value></value>
                   </property>
                   <property>
                   <name>platformClassName</name>
                   <value>org.eclipse.persistence.platform.database.Oracle10Platform</value>
                   </property>
                   </properties>
              </connection-properties>
    </connection-instance>
    </connection-definition-group>
    </outbound-resource-adapter>
    </weblogic-connector>
    Then I decided to use eis/DB/SOADemo for testing.
    For the JDeveloper project, after I deployed it to the WebLogic server, it works fine.
    But for the OSB project, which references the wsdl, jca and mapping file from the JDeveloper project, I still get the same error:
    BEA-380001: Invoke JCA outbound service failed with application error, exception:
    com.bea.wli.sb.transports.jca.JCATransportException: oracle.tip.adapter.sa.api.JCABindingException: oracle.tip.adapter.sa.impl.fw.ext.org.collaxa.thirdparty.apache.wsif.WSIFException: servicebus:/WSDL/DBAdapterTest/DBReader [ DBReader_ptt::DBReaderSelect(DBReaderSelect_inputParameters,PersonTCollection) ] - WSIF JCA Execute of operation 'DBReaderSelect' failed due to: Could not create/access the TopLink Session.
    This session is used to connect to the datastore.
    Caused by Exception [EclipseLink-7060] (Eclipse Persistence Services - 2.0.2.v20100323-r6872): org.eclipse.persistence.exceptions.ValidationException
    Exception Description: Cannot acquire data source [jdbc/SOADataSource].
    Internal Exception: javax.naming.NameNotFoundException: Unable to resolve 'jdbc.SOADataSource'. Resolved 'jdbc'; remaining name 'SOADataSource'.
    ; nested exception is:
    BINDING.JCA-11622
    Could not create/access the TopLink Session.
    This session is used to connect to the datastore.
    Caused by Exception [EclipseLink-7060] (Eclipse Persistence Services - 2.0.2.v20100323-r6872): org.eclipse.persistence.exceptions.ValidationException
    Exception Description: Cannot acquire data source [jdbc/SOADataSource].
    Internal Exception: javax.naming.NameNotFoundException: Unable to resolve 'jdbc.SOADataSource'. Resolved 'jdbc'; remaining name 'SOADataSource'.
    You may need to configure the connection settings in the deployment descriptor (i.e. DbAdapter.rar#META-INF/weblogic-ra.xml) and restart the server. This exception is considered not retriable, likely due to a modelling mistake.
    It is almost driving me crazy! :-(
    What's the purpose of the 'weblogic-ra.xml' under the folder 'C:\Oracle\Middleware\home_11gR1\Oracle_OSB1\lib\external\adapters\META-INF'?
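    Since both failures differ only in which JNDI name cannot be resolved (jdbc/soademoDatabase vs. jdbc/SOADataSource), one thing worth checking is whether that name is actually bound on the server that runs the OSB business service; the bindings listed above were taken from soa_server1, which is not necessarily the server hosting OSB. Below is a minimal standalone sketch for such a check, assuming WebLogic's t3 JNDI provider with the thin t3 client jar on the classpath; the host, port and credentials are placeholders, not values from this thread.
    [CODE]
    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.InitialContext;
    import javax.naming.NameNotFoundException;

    public class JndiDataSourceCheck {
        public static void main(String[] args) throws Exception {
            Hashtable<String, String> env = new Hashtable<>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
            // point this at the managed server that actually hosts the OSB project / DbAdapter
            env.put(Context.PROVIDER_URL, "t3://localhost:7001");
            env.put(Context.SECURITY_PRINCIPAL, "weblogic");
            env.put(Context.SECURITY_CREDENTIALS, "<password>");

            Context ctx = new InitialContext(env);
            for (String name : new String[] {"jdbc/soademoDatabase", "jdbc/SOADataSource"}) {
                try {
                    Object ds = ctx.lookup(name);
                    System.out.println(name + " is bound: " + ds.getClass().getName());
                } catch (NameNotFoundException e) {
                    // the same condition EclipseLink reports as "Cannot acquire data source"
                    System.out.println(name + " is NOT bound on this server");
                }
            }
            ctx.close();
        }
    }
    [/CODE]
    If a name only resolves on soa_server1, the connection instance used by the OSB service has to reference a data source that also exists on the OSB side, which is what the xADataSourceName entries in weblogic-ra.xml above control (and what the BINDING.JCA-11622 message is pointing at).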

  • Upload Excel sheet using Data Integrator 6.1?

    Hi all,
    How do I upload an Excel sheet using Data Integrator, and how do I create an ODBC data source on the PC and on the job server? I am using version 6.1. I am using Excel as one of my data sources, and I would also like to know how to use the different types of data sources in DI. After uploading the Excel file, if I apply any transform to the Excel data I get an error like the one below.
    I am getting an error like this:
    3128 292 CON-120302 09-25-08 09:59:40 ODBC call <SQLDriverConnect> for data source <sas> failed: <[Microsoft][ODBC Driver Manager] Data source name not found and no
    3128 292 CON-120302 09-25-08 09:59:40 default driver specified>. Notify Customer Support.
    1512 2992 CON-120302 09-25-08 09:59:41 ODBC call <SQLDriverConnect> for data source <sas> failed: <[Microsoft][ODBC Driver Manager] Data source name not found and no
    1512 2992 CON-120302 09-25-08 09:59:41 default driver specified>. Notify Customer Support.
    Please help me out.
    Thank you

    Hi Shonti,
    The DI 11.7 installer can be used to upgrade a DI 6.1 local repository (i.e. the upgrade is supported). This will migrate all jobs and flows. They will remain intact; however, this is always a major migration effort and should not be taken lightly. If you do upgrade, please make sure it is a planned effort with rigorous testing and validation. You should also consult the release notes and [supported platforms documentation|https://websmp110.sap-ag.de/~form/sapnet?_SHORTKEY=01100035870000712240&_SCENARIO=01100035870000000202] for the 11.7 package you intend to install. The DI 11.7 documentation also contains information about how to install and configure the Excel Adapter, and what functionality it provides.
    Thanks,
    ~Scott

  • Date parse error while importing users from OIM to OIA (SRM 5.0.3)

    Hi All,
    Env Details:
    OIA (SRM 5.0.3), Weblogic and Oracle 10g DB
    We have integrated OIM with OIA, with extended attribute mappings, by modifying the iam-context.xml file to load users. That works successfully. But when we map a "Date" related attribute, it gives a "Date Parsing error" and the users are not loaded.
    We have also tried loading users using the flat file mechanism, with the same result.
    Please suggest me. Thanks in Advance !!!
    Regards,
    Ravi G.

    Hi,
    It's a problem with the out-of-the-box OIMIAMSolution.class file, which is called while importing users from OIM. It applies its date parse conversion to every attribute whose OIA attribute name ends with "Date", and the conversion is defined for the format yyyy-MM-dd. So it expects the input value to be in that format (yyyy-MM-dd); if it is not, it throws a parse error.
    We found a workaround for this:
    we mapped the value to another related OIA attribute whose name does not end with the string "Date".
    Thanks,
    Ravi G.
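    To make the format expectation concrete, here is a small illustration in plain Java using SimpleDateFormat. The real parsing happens inside the out-of-the-box OIMIAMSolution class, so this is only a sketch of the yyyy-MM-dd constraint and of one possible pre-processing step, not the product code:
    [CODE]
    import java.text.ParseException;
    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class OiaDateFormatDemo {
        public static void main(String[] args) throws ParseException {
            // OIA applies a yyyy-MM-dd parse to every attribute whose name ends with "Date"
            SimpleDateFormat oiaFormat = new SimpleDateFormat("yyyy-MM-dd");
            oiaFormat.setLenient(false);

            System.out.println(oiaFormat.parse("2011-03-25"));   // accepted

            try {
                oiaFormat.parse("03/25/2011");                    // a typical source-side format
            } catch (ParseException e) {
                System.out.println("Date parse error: " + e.getMessage());
            }

            // one possible workaround: reformat the source value before it reaches OIA
            SimpleDateFormat source = new SimpleDateFormat("MM/dd/yyyy");
            Date d = source.parse("03/25/2011");
            System.out.println(oiaFormat.format(d));              // prints 2011-03-25
        }
    }
    [/CODE]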

  • "Schema validation found non-data type errors" error when passing a string value to date field in infopath

    Hi,
    I have an InfoPath browser-enabled form. In the form I have a date field.
    I am passing data from the database to that field using C# code.
    But, as the field comes from the database as a string, I am getting an error and am not able to assign the value.
    I get the date value from the database as "3/25/2011 12:00:00 AM".
    I used the below code:
    [CODE]
    if (objInfopathFormcData.myRecievedDate != null)
      myRoot.SelectSingleNode("/my:myFields/my:field97", NamespaceManager).SetValue(objInfopathFormcData.myRecievedDate);
    [/CODE]
    I am getting the error "Schema validation found non-data type errors".
    How do I set the value of a date field in InfoPath?
    Thank you

    Hi,
    I fixed it. The code below is what I used:
    [CODE]
    XPathNavigator xfield = null;
    DateTime dtmyRecievedDate;
    if (objInfopathFormcData.myRecievedDate != null)
    {
        dtmyRecievedDate = Convert.ToDateTime(objInfopathFormcData.myRecievedDate);
        xfield = myRoot.SelectSingleNode("/my:myFields/my:field97", NamespaceManager);
        // remove the xsi:nil attribute before setting a value, otherwise schema validation fails
        DeleteNil(xfield);
        // write the value as a short date string that the date field accepts
        xfield.SetValue(dtmyRecievedDate.GetDateTimeFormats().GetValue(5).ToString());
    }

    // method to delete xsi:nil
    private void DeleteNil(XPathNavigator nav1)
    {
        if (nav1.MoveToAttribute("nil", "http://www.w3.org/2001/XMLSchema-instance"))
            nav1.DeleteSelf();
    }
    [/CODE]
    Thank you

  • Data access error while running the script in FDM Workbench script editor

    Hello Experts,
    I have been getting an error when I run a script in the script editor of the FDQM Workbench client. The error is -2147467259 Data access error, and it points to line 30 of the script, at "Set rsAppend = DW.DataAccess.farsTable(strWorkTableName)".
    Please advise as soon as possible; I need to load data from PeopleSoft into one of my Planning databases through FDM.
    Thanks in advance...

    Hi,
    Thanks for your suggestions. I am running a data pump script, and when I run it in Workbench it throws the data access error. As you suggested, I created an import format, added this script to it, and assigned it to one of my locations, but it still throws the same error. I get the error at "strWorkTableName"; I want to know what this is and where it comes from. Is there any difference between running a data pump script and an integration script? Please give me detailed navigation steps for running these kinds of scripts without a data access error.
    Note: again, my goal is to load the data to a Planning database through FDQM, which I am doing through SQL statements.
    Please share your suggestions.
    Thanks in advance...

  • ORA-01403: no data found error (given by edit link of APEX-made form)

    Hello!
    I am a newbie in APEX, so please forgive me if my question is stupid or obvious. I got a bit confused by a problem, and I really hope that you professionals can give me some clues about where I could begin eliminating this error.
    I made a report and a form in another page with APEX's tool Add Form/Report and Form. The report shows a full table, and the rows of the table are editable. However, I have rows which I can't edit because APEX gives me the following error:
    ORA-01403: no data found
    Error Unable to fetch row.
    Not all rows do this, but I have a few that do. Of course the report shows them, so I think they must exist. I didn't change anything special on the APEX-made items or processes.
    I thought that the problem could be with my data, but I couldn't see anything weird in those rows.
    Does anyone have an idea of what this could be caused by? I have APEX 2.1.0.00.39.
    Thanks,
    Eszter

    Hi Andy,
    Thank you very much for your time!
    The fields in the form are all right. The fields get filled in perfectly in most of the cases, only those few rows don't :(
    However, now that you have described the row-fetching process, I think I may have an idea of what is happening. My table has a composite primary key (two fields together make the primary key; I know that this is quite bad practice, but, much to my regret, I cannot change it), and one of the fields is a date. Now, this date is in YY-MON-DD format, which is what my language uses.
    One of these dates is from the 1800s. As my report shows it, the year gets truncated to the last two characters. APEX passes this value into the form field as varchar2, and when it tries to cast it back from the YY-MON-DD format, it assumes the year is in the 1900s instead of the 1800s. With 19xx, however, it doesn't find my row.
    Does this sound logical? It seems logical to me, but I am a beginner... :(
    Still, if this is the core of the problem, it's most probably not the only problem, because I have dates from 19xx that can't identify their rows either... But I am suspicious because of these date issues. If you have any ideas, please let me know.
    Thanks,
    Eszter

  • Oracle Data Integrator 11.1.1.5 Work Schema - List of Privileges

    Hi All,
    Oracle Data Integrator 11.1.1.5.
    Extracting data from Oracle DB for Oracle EBS 12.1.3.
    Customer created read-only schema (XXAPPS) to extract the data from EBS.
    For the ODI work schema we have now created a schema 'XBOL_ODI_TEMP' on the source DB. We are looking for the appropriate privileges that need to be granted to XXAPPS and XBOL_ODI_TEMP so that we won't face any permission-related error messages when we run the ODI scenario.
    We are now facing the error message : ODI-1227: Task SrcSet0 (Loading) fails on the source ORACLE connection VTB_ORACLE_EBS_1213.
    Caused By: java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist.
    Similar privileges can be granted to the work schema on the target.
    Venkat

    I think it would be fine with only one schema (user) created on the source system, one that has read access to the tables of the EBS DB. To resolve this error, assuming the XXAPPS user is the one being used:
    in the topology --> data server (for EBS) --> physical schema, select the EBS schema name as the Schema and XXAPPS as the Work Schema (for all ODI work-related objects, e.g. CDC).
    Also, in the data server, the XXAPPS user, which has read access to the EBS tables, needs to be used.
    Now, every time ODI generates a query, it will access a table, say DUMMY, as <EBS Schema>.DUMMY, so the reference resolves correctly.
    Alternatively, you can create synonyms for the EBS tables in the XXAPPS schema.

  • How to call text file using Script in Data Integrator

    Dear All,
    Can anyone assist me with how to call a text file using a script in Data Integrator?
    And one more question:
    I have 32 CSV files that I want to combine into one table with Data Integrator. Can anyone assist me with that?

    Mary,
    Since you know the file name, when the name is clicked, send it to the server, read the file and write it to the servlet output stream.
    I think this will help you.
    If anything is wrong in my code, the forums will help you further.
    BufferedInputStream bis = null;
    BufferedOutputStream bos = null;
    int bytesRead = 0;
    byte[] buff = new byte[1024];
    File f = new File("test.txt");
    try {
        bis = new BufferedInputStream(new FileInputStream(f));
        bytesRead = bis.read(buff, 0, buff.length);
        if (bytesRead != -1) {
            // create a BufferedOutputStream from the ServletOutputStream
            bos = new BufferedOutputStream(response.getOutputStream());
            do {
                bos.write(buff, 0, bytesRead);
            } while ((bytesRead = bis.read(buff, 0, buff.length)) != -1);
            bos.flush();
        }
    } catch (Exception e) {
        // error handling
    } finally {
        // close the streams
        if (bis != null) try { bis.close(); } catch (IOException ignored) {}
        if (bos != null) try { bos.close(); } catch (IOException ignored) {}
    }
