Excel throws an error for large data dumps

Hi,
I have a report which displays 1,000,000 records. I created an agent and ran the report; the data is loaded into Excel, but when we open the Excel file it throws an error:
A DDE error has occurred, and a description of the error cannot be displayed because it is too long. If the file name or path is long, try renaming the file or copying it into a different folder.
At minimum I need to display 500,000 records.
Thanks,

Hi,
You can filter the data at the report level as well as at the dashboard level using prompts.
Example:
If you have a Period table in the report, filter on year = 2012 only,
or, if you have a prompt on the dashboard, filter there.
The agent also has condition rules that apply when the report is downloaded.
Check this: makshu.blogspot.com/increase-row-limits-in-table-properties.html
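Note also that Excel 2007 and later caps a worksheet at 1,048,576 rows, so a million-row dump sits right at the edge of what a single sheet can hold. The linked post raises OBIEE's table view limits in instanceconfig.xml; here is a minimal sketch (element names as documented for OBIEE 11g, values purely illustrative, so treat anything not in this thread as an assumption):

    <!-- instanceconfig.xml, inside <ServerInstance>; restart Presentation Services after editing -->
    <Views>
      <Table>
        <!-- hard cap on the number of cells a table view may render/export -->
        <MaxCells>2000000</MaxCells>
        <!-- maximum rows the view will show -->
        <MaxVisibleRows>500000</MaxVisibleRows>
      </Table>
    </Views>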
Regards
VG

Similar Messages


  • INPUT: KT4/V vs CRC error in large data transfer/CD burning HERE!

    This issue can be solved with a BIOS update. KT4V & KT4 Ultra users who are having this problem can request the TEST BIOS to try on your system. You may either PM/email me or Bas, or get it at ftp://ftp.heppen.be/MSI/
    Please report back whether the test BIOS really fixes the problem, causes any new problem, or incurs any performance hit.
    ** this sounds like a Christmas Gift to KT4V users AND New Year Gift to KT4 Ultra users!!!  :D  **
    To all KT4 Ultra and KT4V users: whether or not you have data corruption or CRC errors in large data transfers and CD burning, your input is needed.
    Please list your system specs in as much detail as possible. Below is a guideline; you may copy it (Ctrl-C) and fill in your specs in your post.
    1. System specs:
    CPU:
    Motherboard:
    RAM Slot-1: (exact brand and model)
    RAM Slot-2:
    RAM Slot-3:
    display card:  [no overclock]
    IDE-1M:  (exact HDD brand and model pls)
    IDE-1S:
    IDE-2M:
    IDE-2S:
    IDE-3:
    SER-1:
    SER-2:
    PCI-1:
    PCI-2:
    PCI-3:
    PCI-4:
    PCI-5:
    PCI-6:
    PSU: (brand, model, total power, (estimated) combined power)
    BIOS revision:
    Operating System:
    VIA 4-in-1 drivers : (if you installed it, tell us the version)
    other drivers, services or applications might affect the data transfer such as : PCI Latency patch, WPCREDIT modifications, VCool, CoolerXP...
    2. CRC ERROR?
    PASS or FAIL
    If PASS, let us know your BIOS settings.
    If FAIL, proceed as below:
    3. Please use these BIOS settings:
    1. Load BIOS Setup Defaults
    2. NO OVERCLOCK ON FSB! Set it according to your CPU
    3. Set RAM to
    a) SPD; if that fails, try
    b) user-defined with the slowest RAM timings, i.e. 266, 2.5, 3, 6, 3, disable interleave, 4, disable 1T, normal
    If PASS, go for the more extreme BIOS settings you usually use:
    1. High Performance Defaults
    2. set RAM to the extreme timings
    3. DO NOT overclock until both 1. and 2. PASS
    4. Try these suggestions:
    1. Microsoft IDE drivers (uninstall the VIA 4-in-1s)
    2. a different VIA 4-in-1 version's IDE filter driver
    3. the VIA IDE Miniport driver
    4. use the IDE-3 RAID channel for one HDD's data transfers
    5. same-HDD transfer, i.e. C:\dir1\*.* -> C:\dir2\*.*
    6. burn CDs at 1x speed
    7. set the HDD and/or CD to PIO mode, or a slower UDMA mode
    8. if and only if you know how to update a BIOS correctly and are willing to take some risk, also try the BETA BIOS: KT4 (1.25), KT4V (1.64)
    Please report back your tests and experiments with these suggestions.
    If you have any workaround for this issue other than setting the FSB back to 100 MHz, please tell us too.
    Thanks for your inputs!

    My system (just got it 2 days ago):
    CPU: Athlon XP 2000+
    Motherboard: MSI KT4V (MS-6712)
    RAM Slot-1:
    RAM Slot-2: 512 Mb Kingston DDR 333 CAS 2.5
    RAM Slot-3:
    display card: Abit Siluro GF3 Ti200
    IDE-1M: Western Digital WD800JB (8 Mb Cache) - 80 GB
    IDE-1S: Seagate U-Series ST360020A - 60 GB
    IDE-2M: Sony DVD-ROM 16x (DDU1621)
    IDE-2S: Creative CDRW121032
    IDE-3:
    SER-1:
    SER-2:
    PCI-1:
    PCI-2:
    PCI-3: Accton 1207F 10/100 Fast Ethernet Card
    PCI-4: SBLIVE 5.1 Platinum with Live!Drive II
    PCI-5:
    PCI-6:
    PSU: 400Watts (Generic)
    BIOS revision: 1.6
    Operating System: Windows 2000 with SP3
    VIA 4-in-1 drivers : Hyperion 4.45, only AGP and INF installed. IDE drivers are standard Win2k/SP3 ones.
    2. CRC ERROR?
    FAIL.
    When I got my system, I tried installing Windows ME as I wanted to dual-boot together with Win2K. When I tried to install the NVIDIA Detonator drivers (ver 30.82), it proceeded normally and asked for a reboot, which I did; then it just hung before the start of Windows ME. I rebooted, selected "Normal" when Windows ME detected a failed startup, and was then able to enter Windows ME, but it reported that the NVIDIA Detonator drivers were invalid and of the wrong type for my display card.
    Later I tried to move my files from C:\ to D:\ and it reported that my destination file was invalid.
    When I changed my OS to Win2k/SP3 (no more dual-boot) and installed the same NVIDIA Detonator driver version, it worked. When I started to copy files again, it later BSODed with PAGE_FAULT_ERROR (something like this). When installing from CDs, it would report that my .CAB files were corrupt or that there was insufficient swap file space (I set mine manually at 1.2 GB). Then there were times during reboots into Win2K when I found my keyboard and mouse (all PS/2) not working even though Windows loaded as usual.
    Later I changed my PCI latency setting from 32 to 96, and I managed to install from CD without much further issue.
    Upon reading these posts, I hadn't realized that the MSI KT4 series or the KT400 chipset had so many issues! I have read countless sites like ExtremeTech and AnandTech and none reported the particular errors I encountered during my first 2 days with this setup. (This is my first Athlon setup; I was previously an Intel person.)
    So far, I conclude from my settings that:
    -32-bit access in the BIOS settings for CD/DVD/CD-RW must be disabled; I concur with Shumway's recommendations.
    -DMA settings in Windows 2000 for CD/DVD/CD-RW must be set to PIO mode, otherwise copying from CD to HDD will produce read errors.
    -Installing the PCI latency fix really does wonders for my setup (PCI Latency fix ver 1.9). Now I can copy files from all my drives without worrying so much about CRC errors. Thank you, George E. Breese.
    -I really want to know why I can't install the NVIDIA Detonator drivers in WinME, while in Win2K I can.
    I will post again once I have done some more tests on my system, especially CD-R writes.
    Angel17

  • Excel 2012 SP1 plug-in for Master Data Services - Bugs???

    Excel 2012 SP1 plug-in for Master Data Services, 32-bit or 64-bit
    1.
    My entity A has 50,000 records.
    Entity B has a domain attribute "City" from entity A.
    I successfully add new records to entity B, but after a refresh I cannot see any data in the domain attribute "City" for the new records (empty cells)!!! Yet in the web client (or through SSMS, in the table) I can see the data in the "City" attribute (e.g. code_095 {Moscow}).
    If I reduce the number of records in entity A to 20,000, then I do see code_095 {Moscow} in the Excel 2012 SP1 plug-in for MDS.
    Bug???
    2.
    Why does the Excel 2012 SP1 plug-in for MDS reset the cell formats??? Bug???
    3.
    I have created rows for my own headings (business captions) above the MDS table in the Excel worksheet.
    Why, after applying a filter (query from server), is the Excel worksheet completely recreated? All my row headers have been removed!!! What the hell?
    from Moscow with money

    Superbluesman, is this still an issue? This looks like a bug. Did you file a Connect item?
    Thanks!
    Ed Price, Azure & Power BI Customer Program Manager (Blog, Small Basic, Wiki Ninjas, Wiki)
    Answer an interesting question? Create a wiki article about it!

  • JDBC Adapter throws error for Null Data

    Hi All,
    I am running a Proxy-to-JDBC interface. In case the proxy has no new data, the message that goes to the database is:
    <?xml version="1.0" encoding="utf-8" ?>
    <ns1:MT_Material xmlns:ns1="urn:sce-com:xi:fi:UnitEstimateData">
      <Insert_Material>
        <dbTableName action="INSERT">
          <table>SCEI_UE_MATERIALS</table>
        </dbTableName>
      </Insert_Material>
    </ns1:MT_Material>
    and I get the error in RWB: Message processing failed. Cause: com.sap.aii.af.ra.ms.api.RecoverableException: Error processing request in sax parser: Error when executing statement for table/stored proc. 'SCEI_UE_MATERIALS' (structure 'Insert_Material'): java.sql.SQLException: FATAL ERROR document format in structure 'Insert_Material': expected 'access' tag(s) not found.
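    (For reference, the JDBC receiver adapter's document format expects each statement to carry at least one access element holding column/value pairs; below is a sketch of the shape it accepts when there is data. The column name MATNR is purely hypothetical:)

        <ns1:MT_Material xmlns:ns1="urn:sce-com:xi:fi:UnitEstimateData">
          <Insert_Material>
            <dbTableName action="INSERT">
              <table>SCEI_UE_MATERIALS</table>
              <access>
                <MATNR>1000123</MATNR>
              </access>
            </dbTableName>
          </Insert_Material>
        </ns1:MT_Material>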
    Is there a way to make the JDBC adapter not throw an error when there is no access node?
    Please advise...
    XIer

    Xier,
    Why would it create multiple Insert_Material nodes? Use something like this:
    Mandatory segment -> removeContext -> collapseContext -> exists -> Insert_Material
    Try the above and let me know if it doesn't help!!
    raj.

  • WCF service connection forcibly closed by the remote host for large data

    Hello,
    A WCF service is used to generate an Excel report. When the stored procedure returns a large data set (around 30,000 records), the service fails to return the data. Below is the error log:
    System.ServiceModel.CommunicationException: An error occurred while receiving the HTTP response to <service url>. This could be due to the service endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being aborted by the server (possibly due to the service shutting down). See server logs for more details. ---> System.Net.WebException: The underlying connection was closed: An unexpected error occurred on a receive. ---> System.IO.IOException: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host. ---> System.Net.Sockets.SocketException: An existing connection was forcibly closed by the remote host
       at System.Net.Sockets.Socket.Receive(Byte[] buffer, Int32 offset, Int32 size, SocketFlags socketFlags)
       at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
       --- End of inner exception stack trace ---
       at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
       at System.Net.PooledStream.Read(Byte[] buffer, Int32 offset, Int32 size)
       at System.Net.Connection.SyncRead(HttpWebRequest request, Boolean userRetrievedStream, Boolean probeRead)
       --- End of inner exception stack trace ---
       at System.Net.HttpWebRequest.GetResponse()
       at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout).
       --- End of inner exception stack trace ---
    Server stack trace:
       at System.ServiceModel.Channels.HttpChannelUtilities.ProcessGetResponseWebException(WebException webException, HttpWebRequest request, HttpAbortReason abortReason)
       at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout)
       at System.ServiceModel.Channels.RequestChannel.Request(Message message, TimeSpan timeout)
       at System.ServiceModel.Dispatcher.RequestChannelBinder.Request(Message message, TimeSpan timeout)
       at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
       at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)
       at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)
    Exception rethrown at [0]:
       at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
       at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
       at IDataSetService.GetMastersData(Int32 tableID, String userID, String action, Int32 maxRecordLimit, Auditor& audit, DataSet& resultSet, Object[] FieldValues)
       at SPARC.UI.Web.Entities.Reports.Framework.Presenters.MasterPresenter.GetDataSet(Int32 masterID, Object[] procParams, Auditor& audit, Int32 maxRecordLimit).
    WEB CONFIG SETTINGS OF SERVICE
    <httpRuntime maxRequestLength="2147483647" executionTimeout="360"/>
    <binding name="BasicHttpBinding_Common" closeTimeout="10:00:00" openTimeout="10:00:00"
             receiveTimeout="10:00:00" sendTimeout="10:00:00" allowCookies="false"
             bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
             maxBufferSize="2147483647" maxBufferPoolSize="0" maxReceivedMessageSize="2147483647"
             messageEncoding="Text" textEncoding="utf-8" transferMode="Buffered"
             useDefaultWebProxy="true">
      <readerQuotas maxDepth="2147483647" maxStringContentLength="2147483647"
                    maxArrayLength="2147483647" maxBytesPerRead="2147483647"
                    maxNameTableCharCount="2147483647" />
      <security mode="None" />
    </binding>
    WEB CONFIG SETTINGS OF CLIENT
    <httpRuntime maxRequestLength="2147483647" requestValidationMode="2.0"/>
    <binding name="BasicHttpBinding_Common" closeTimeout="10:00:00" openTimeout="10:00:00"
             receiveTimeout="10:00:00" sendTimeout="10:00:00" allowCookies="false"
             bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
             maxBufferSize="2147483647" maxBufferPoolSize="2147483647" maxReceivedMessageSize="2147483647"
             messageEncoding="Text" textEncoding="utf-8" transferMode="Buffered"
             useDefaultWebProxy="true">
      <readerQuotas maxDepth="2147483647" maxStringContentLength="2147483647"
                    maxArrayLength="2147483647" maxBytesPerRead="2147483647"
                    maxNameTableCharCount="2147483647" />
    </binding>

    Binding configuration on a WCF service to override the default settings is not done the same way it would be done in the client-side config file.
    A custom binding must be used in the WCF service-side config to override the default binding settings on the service side.
    http://robbincremers.me/2012/01/01/wcf-custom-binding-by-configuration-and-by-binding-standardbindingelement-and-standardbindingcollectionelement/
    The readerQuotas and everything else must be given in the custom binding to override any default settings on the WCF service side.
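    (A minimal sketch of such a service-side custom binding, assuming the HTTP transport and text encoding that basicHttpBinding uses; the service class name is a placeholder, while IDataSetService is taken from the stack trace above:)

        <bindings>
          <customBinding>
            <binding name="LargeDataCustomBinding">
              <!-- text encoding carrying the raised reader quotas -->
              <textMessageEncoding messageVersion="Soap11">
                <readerQuotas maxDepth="2147483647" maxStringContentLength="2147483647"
                              maxArrayLength="2147483647" maxBytesPerRead="2147483647"
                              maxNameTableCharCount="2147483647" />
              </textMessageEncoding>
              <!-- transport-level limits for the large response -->
              <httpTransport maxReceivedMessageSize="2147483647" maxBufferSize="2147483647" />
            </binding>
          </customBinding>
        </bindings>
        <services>
          <!-- service class name is hypothetical -->
          <service name="Sparc.Services.DataSetService">
            <endpoint address="" binding="customBinding"
                      bindingConfiguration="LargeDataCustomBinding"
                      contract="IDataSetService" />
          </service>
        </services>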
    Also, you are posting to the wrong forum.
    http://social.msdn.microsoft.com/Forums/vstudio/en-us/home?forum=wcf

  • InfoPackage error for Hierarchy data loads after upgrading to BI 7.31

    Hi All,
    We have recently upgraded to BI 7.31 from BI 7.01. After the upgrade we are observing an InfoPackage error while loading the hierarchy for the 0BPARTNER InfoObject; below is the long text of the error message.
    Datasource name : 0BPARTNER_NAME_HIER
    Error in the hierarchy structure: see error log
    Message no. RH003
    Diagnosis
    The structure of the hierarchy contains errors. You can find a detailed description of the error in the log.
    Procedure
    Correct the error in the hierarchy structure as described in the log.
    [Screenshot of the InfoPackage Details tab]
    The error message suggests checking the error log, but there are no error logs to refer to.
    Please let us know if anyone has faced a similar issue and can help us resolve it. Also let us know where to find the error log in such a scenario.
    We have already tried replicating the DataSource on the BW side, but with no success. We also checked the extractor on the source system side and found it to be working fine.
    Thanks & Regards,
    Viraj

    Hi Viraj,
    Did you find a solution to your problem?
    I am facing the same issue. The process chain runs fine without throwing any errors, but if I try to run it manually it fails.
    When we check the data, a few of the nodes that are maintained in the source are not reflected.
    Error message: Error in the hierarchy structure.
    Source system: the extraction check was performed and the record count matches. We took a dump of the data and were able to find the required data, but it is not reflected under maintain hierarchies.
    Could you please help me with a solution.
    Thank you
    Regards,
    Madhav

  • bcp doesn't throw an error when the data length exceeds the size of the column

    Hi,
    We are using bcp in SQL Server 2008 R2 to import data from a flat file. When the data length exceeds the size of the column, it doesn't throw any error; instead it silently ignores the row.
    Please suggest how we can truncate the data and load it into the table.
    Thanks,
    Pasha

    Hi Pasha,
    According to your description, you want to import the data from a flat file into a SQL Server table, truncating overlong values, in SQL Server 2008 R2. To achieve this, we can use the Import and Export Wizard. For more details, please refer to the following steps:
    Launch SSMS by clicking SQL Server Management Studio in the Microsoft SQL Server program group.
    Right-click the destination database in Object Explorer, select Tasks, then Import Data from the context menu to launch the Import Wizard.
    Choose Flat File Source as the data source, then browse to the flat file.
    Choose SQL Server Native Client 10.0 as the destination, then select the destination database.
    Click the Edit Mappings button to change column sizes or other properties.
    Finish the wizard.
    For an example of how to use the Import and Export Wizard, please refer to the blog below:
    http://www.mssqltips.com/sqlservertutorial/203/simple-way-to-import-data-into-sql-server/
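    (If you need to stay with bcp itself, one common workaround, sketched here with hypothetical table and column names, is to bulk-load into a deliberately wide staging table and truncate explicitly on insert:)

        -- staging table with wide columns, so bcp never hits a length limit
        CREATE TABLE dbo.CustomerStage (Name nvarchar(4000), City nvarchar(4000));

        -- load the flat file into the staging table, e.g.:
        --   bcp MyDb.dbo.CustomerStage in customers.txt -c -t, -S myserver -T

        -- truncate explicitly to the target column sizes while inserting
        INSERT INTO dbo.Customer (Name, City)
        SELECT LEFT(Name, 50), LEFT(City, 30)
        FROM dbo.CustomerStage;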
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • XML Solutions for Large Data Sets

    Hi,
    I'm working with a large data set (9 million records comprising 36 gigabytes) and am exploring the use of XML with it.
    I've experimented with a JDBC app (taken straight from Steve Muench's excellent Oracle XML Applications) for writing to CLOBs, but achieve throughput of much less than 40k/s (the minimum speed required to process the data in under 10 days).
    What kind of throughput is possible loading XML records from CLOBs into multiple tables (using server-side Java apps)?
    Could anyone comment on whether XML is feasible for a data set of this size?
    Regards,
    Mike
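    (For anyone revisiting this thread on a current Oracle release: a set-based alternative to per-record Java parsing is to shred the CLOBs in SQL with XMLTABLE, available from Oracle 10g onward. A sketch with invented table, column, and element names:)

        -- shred an XML CLOB column into relational rows in one INSERT ... SELECT
        INSERT INTO orders (order_id, customer, amount)
        SELECT x.order_id, x.customer, x.amount
        FROM order_clobs c,
             XMLTABLE('/orders/order' PASSING XMLTYPE(c.doc)
                      COLUMNS order_id NUMBER        PATH '@id',
                              customer VARCHAR2(100) PATH 'customer',
                              amount   NUMBER        PATH 'amount') x;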

    I would just like to identify myself (I'm the submitter):
    Michael Driscoll <[email protected]>.

  • Endeca: multi-invoice pay throws the correct error for an internal user but fails to throw the same error for an external user

    Hi,
    1) Internal user, expected exception:
    Exception: Payments, apply credits, disputes and print are not supported when multiple customer/currency transactions are selected
    2) For the external user, the error below is thrown instead of the above exception.
    Error
      You are trying to access a page that is no longer active.
      The referring page may have come from a previous session. Please select Home
       to proceed.
    I found this MACCHECK in the FND logs of the external user's payment:
    MACCHECK: . Parameter failing validation is :mode. The parameter mode with value MultiPay could not be recognized as part of Server's response on the previous request.  Incoming URL is : /OA_HTML/OA.jsp?page=/oracle/apps/ar/irec/endeca/webui/EndecaDummyPG . Current URL is : /OA_HTML/OA.jsp?page=/oracle/apps/ar/irec/endeca/webui/OIREndecaCustHomePG&akRegionApplicationId=222&_ti=1125493452&oapc=10&retainAM=Y&addBreadCrumb=N&oas=6-LL4ndIUFLX-2zjQAQD6A.. . Referer URL is : https://<hostname>:4443/endeca/web/ar/customer?doAsUserLanguageId=en_US&languageId=en_US . HTTP Request Method is : POST
    Can someone please help?
    Thanks,
    RRS

    Well, I compared my classpath between my Windows batch file and the makefile (that comes with the samples installation) on Solaris and realized that I was using different sets of jars.
    So, I removed the extra jars from the makefile to narrow down the problem. If I remove /opt/SUNWam/lib/servlet.jar from the makefile, I can reproduce this problem on the Solaris box as well.
    When I include this servlet.jar on my Windows machine, the program works! The only jars I have in my classpath are amclientsdk.jar and servlet.jar, which I copied from my installation (/opt/SUNWam/lib) on the Solaris box.
    In just the same way, by copying am_services.jar, saaj-api.jar, and jaxm-api.jar from the Solaris box to the Windows machine, I am also able to pull the assertions from the Access Manager.
    I installed Sun Java Enterprise System 2005Q1 on a Solaris 10 machine.
    During the installation, I configured to install the Access Manager
    in Sun Application Server.
    Why do I need to have a different set of jars on the Windows machine for the Access Manager client SDK?
    Could you please point me to a download link where I can get the correct Windows Access Manager Client SDK for Sun Java System Access Manager 6.0 (Sun JES 2005Q1)?
    Thanks.

  • java.lang.OutOfMemoryError: allocLargeObjectOrArray error for large payloads

    Ours is an outbound flow where an FTP adapter picks up the files and calls a requester service; the requester service calls the EBS, the EBS calls the provider service, and finally the file is written using B2B.
    For the last 4-5 days we have been getting java.lang.OutOfMemoryError: allocLargeObjectOrArray.
    We are getting this error when large payloads are used for testing.
    As per our understanding, when you have a tree of composite invocations (so A invokes B, which invokes C, which invokes D via flowN 100 times), none of the memory is released until they all complete.
    1. Could you please let us know exactly when memory is released?
    2. How to tune/optimize this?
    Our flow is like:
    SyncDisbursePaymentGetFtpAdapter --> CreateDisbursedPaymentEbizReqABCSImp l--> SyncDisbursePaymentRBTTEBS --> SyncDisbursedPaymentJPMC_CHKProvABCSImpl--> AIAB2BInterface --> Oracle B2B
    <Dec 12, 2012 8:17:06 PM EST> <Warning> <RMI> <BEA-080003> <RuntimeException thrown by rmi server: javax.management.remote.rmi.RMIConnecti\
    onImpl.invoke(Ljavax.management.ObjectName;Ljava.lang.String;Ljava.rmi.MarshalledObject;[Ljava.lang.String;Ljavax.security.auth.Subject;)
    javax.management.RuntimeErrorException: java.lang.OutOfMemoryError: allocLargeObjectOrArray: [B, size 667664.
    javax.management.RuntimeErrorException: java.lang.OutOfMemoryError: allocLargeObjectOrArray: [B, size 667664
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrow(DefaultMBeanServerInterceptor.java:858)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.rethrowMaybeMBeanException(DefaultMBeanServerInterceptor.java:869)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:838)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:761)
    at weblogic.management.jmx.mbeanserver.WLSMBeanServerInterceptorBase$16.run(WLSMBeanServerInterceptorBase.java:449)
    Truncated. see log file for complete stacktrace
    Caused By: java.lang.OutOfMemoryError: allocLargeObjectOrArray: [B, size 667664
    at java.util.Arrays.copyOf(Arrays.java:2786)
    at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
    at java.io.ObjectOutputStream$BlockDataOutputStream.drain(ObjectOutputStream.java:1847)
    at java.io.ObjectOutputStream$BlockDataOutputStream.setBlockDataMode(ObjectOutputStream.java:1756)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1169)
    Truncated. see log file for complete stacktrace
    Does anyone have any idea how to rectify this error? The whole UAT environment is down because of this issue.

    Please find the required info:
    1. Operating system: Linux
    2. JVM (Sun or JRockit): JRockit
    3. Domain info (production mode enabled?, log levels, number of servers in cluster, number of servers in a machine):
    a) Production mode enabled: production mode is not enabled; we are going to enable it.
    b) Log levels: there are many logs (B2B, SOA, BPEL, integration); which log level do we need to set to FINEST (32)?
    c) Number of servers in cluster: 2
    d) Number of servers in a machine: 1
    4. Payload info (size, xml/non-xml?):
    a) Size: more than 1 MB, and up to 25 MB
    b) xml/non-xml: xml
    We are trying to make the changes as suggested by you and will update accordingly.

  • Variable substitution throwing an error for an empty payload

    Hi All,
    I am using variable substitution in the receiver file adapter...
    Everything is working fine: variable substitution works and creates a file whenever the payload in the mapping has the filenode field used for substitution.
    Now, based on some condition, the payload in the mapping will be empty, and then no file should be created...
    For this I used IGNORE (empty-message handling) in the receiver adapter, BUT an empty file is still being created; this is because the payload still contains the filenode field...
    Now my question is how we can stop the empty file from being created...
    I even tried using a dynamic variable in the mapping, but that also throws an error...
    Please suggest how to solve this...
    Regards,
    sridhar

    sridhar reddy kondam wrote:
    > Now, based on some condition, the payload in the mapping will be empty, and then no file should be created...
    > For this I used IGNORE in the receiver adapter, BUT an empty file is still being created; this is because the payload still contains the filenode field...
    Make sure the node is not created in the mapping at all, so that you will not have the issue.
    Is that what you are looking for?
    Otherwise, give more details so that we can help.

  • Regarding an SQL function error for Hijri date to Gregorian date conversion

    Hi ,
    I want to convert the Hijri date format into the Gregorian date format. I wrote the script with the sql function like this:
    $Hijri_Date = '16/04/1428';
    $Gregorian_Date = sql('DS_REPO','SELECT CONVERT(DATE,[$Hijri_Date],131)');
    print($Gregorian_Date);
    Here the $Hijri_Date data type is varchar and the $Gregorian_Date data type is date,
    but I am getting an error like:
    7868     5812     DBS-070401     10/26/2010 10:37:18 PM     |Session Job_Hijradata_Conversion
    7868     5812     DBS-070401     10/26/2010 10:37:18 PM     ODBC data source <UIPL-LAP-0013\SQLEXPRESS> error message for operation <SQLExecute>: <[Microsoft][SQL Server Native Client
    7868     5812     DBS-070401     10/26/2010 10:37:18 PM     10.0][SQL Server]Explicit conversion from data type int to date is not allowed.>.
    7868     5812     RUN-050304     10/26/2010 10:37:18 PM     |Session Job_Hijradata_Conversion
    7868     5812     RUN-050304     10/26/2010 10:37:18 PM     Function call <sql ( DS_REPO, SELECT CONVERT(DATE,16/04/1428,131) ) > failed, due to error <70401>: <ODBC data source
    7868     5812     RUN-050304     10/26/2010 10:37:18 PM     <UIPL-LAP-0013\SQLEXPRESS> error message for operation <SQLExecute>: <[Microsoft][SQL Server Native Client 10.0][SQL
    7868     5812     RUN-050304     10/26/2010 10:37:18 PM     Server]Explicit conversion from data type int to date is not allowed.>.>.
    7868     5812     RUN-053008     10/26/2010 10:37:18 PM     |Session Job_Hijradata_Conversion
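    (The log above shows the likely root cause: the substituted value is not quoted, so SQL Server parses 16/04/1428 as integer division and then refuses the int-to-date conversion. A sketch of the quoted statement, relying on CONVERT style 131, the Hijri dd/mm/yyyy style; how you escape the quotes inside the Data Services sql() call depends on your version, so treat that part as an assumption:)

        -- failing form, after substitution (integer arithmetic, then int -> date):
        --   SELECT CONVERT(DATE, 16/04/1428, 131)
        -- quoted form: the value is a string literal in Hijri dd/mm/yyyy format
        SELECT CONVERT(DATETIME, '16/04/1428', 131);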
    Please help me solve this problem, or suggest any other solution to convert the Hijri date format to the Gregorian date format.
    Thanks & Regards,
    Ramana.

    Hi,
    In Data Quality there is no built-in function for converting a Hijri date to a Gregorian date; we have a function for converting a Julian date to a Gregorian date.
    Thanks & Regards,
    Ramana.

  • Data Model Best Practices for Large Data Models

    We are currently rolling out Hyperion IR 11.1.x and are trying to establish best practices for BQYs and how to present these models to our end users.
    So far, we have created an OCE file that limits the selectable tables to only those within the model.
    Then we created a BQY that brings the tables into a data model, created metatopics for the main tables, and integrated the descriptions via lookups in the metatopics.
    This seems to be OK; however, any time I try to add items to a query, as soon as I add columns from different tables the app freezes, hogs a bunch of memory, and then closes itself.
    Obviously this isn't acceptable to hand to our end users, so I'm asking for suggestions.
    Are there settings I can change to get around this memory-hogging issue? Do I need to use a smaller model?
    And in general, how are you all deploying this tool to your users? Our users are accustomed to a pre-built data model where they can just click, add the fields they want, and hit submit. How do I get close to that ideal with this tool?
    Thanks for any help/advice.

    I answered my own question. In the case of the large data model, the tool by default was attempting to calculate every possible join path to get from table A to table B (even though there is a direct join between them).
    In the data model options, I changed the join setting to use the join path with the fewest topics. This skipped the extraneous steps and allowed me to proceed as normal.
    Hope this helps anyone else who bumps into this issue.

  • ESS error for Personal Data and Family Members

    Hi,
    I am getting an error in ESS for Personal Data and Family Members.
    The error message is:
    A critical error has occurred. Processing of the service had to be terminated. Unsaved data has been lost.
    Please contact your system administrator.
    An exception occurred that was not caught; error key RFC_ERROR_SYSTEM_FAILURE.
    Thanks
    Sasikanth

    Check the table V_T7XSSPERSUBTYP.
    Also check, via transaction HRUSER, whether the user has an employee assigned.
    Check the log using transaction ST22.
    You should also run a trace using transaction ST01.
    Hope this helps you.
    Regards
