Time_out issue (Vast Amount of Data)

I was loading a vast amount of data into an InfoObject (master data) and kept getting a TIME_OUT, so I increased the timeout in the InfoPackage from 7 hours to 1 day. Now the load fails with "Not enough memory" and a source system error. How can I increase the size of the InfoObject? Is there a way around this?
Thanks

Do you know roughly how many records you need to load? You might be better off creating separate InfoPackages with different selection criteria so that all your data gets loaded in pieces.
For example, if you have 100,000,000 materials, you may want to divide the extract into several loads: one to load material numbers between A and B9999999, and so on.
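Purely to illustrate the arithmetic of equal-width key ranges, here is a generic PL/SQL sketch (nothing SAP-specific; the four-way split and the digit padding are invented for the example):

declare
  n constant pls_integer := 4;  -- number of separate loads
begin
  -- print an equal-width leading-letter selection range for each load
  for i in 0 .. n - 1 loop
    dbms_output.put_line(
      chr(ascii('A') + trunc(26 * i / n)) || '0000000 .. ' ||
      chr(ascii('A') + trunc(26 * (i + 1) / n) - 1) || '9999999');
  end loop;
end;
/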
Does this help?

Similar Messages

  • Bex Report Designer - Large amount of data issue

    Hi Experts,
    I am trying to execute (on the Portal) a report made in BEx Report Designer, with about 30,000 pages, and the only thing I am getting is a blank page. Everything works fine at about 3,000 pages. Do I need to set something to allow processing of such a large amount of data?
    Regards
    Vladimir

    Hi Sauro,
    I have not seen this behavior, but it has been a while since I tried to send an input schedule that large. I think the last time was on a BPC NW 7.0 SP06 system and it worked OK. If you are on a recent support package, then you should search for relevant notes (none come to mind for me, but searching yourself is always a good idea) and if you don't find one then you should open a support message with SAP, with very specific instructions for recreating the problem from a clean input-schedule.
    Good luck,
    Ethan

  • Power BI performance issue when load large amount of data from database

    I need to load a data set from my database which contains a large amount of data, and it takes a long time to initialize before I can build a report. Is there a good way to process large amounts of data for Power BI? Since many people analyze data with Power BI, is there any suggestion for loading large amounts of data from a database?
    Thanks a lot for the help

    Hi Ruixue,
    We have made significant performance improvements to Data Load in the February update for the Power BI Designer:
    http://blogs.msdn.com/b/powerbi/archive/2015/02/19/6-new-updates-for-the-power-bi-preview-february-2015.aspx
    Would you be able to try again and let us know if it's still slow? With the latest improvements, it should take between one half and one third of the time it used to.
    Thanks,
    M.

  • Report in Excel format fails for huge amount of data with headers!!

    Hi All,
    I have developed an Oracle report which fetches up to 5,000 records.
    The requirement is to fetch up to 100,000 records.
    This report fetches data if the headers are removed; if the headers are included, it is not able to fetch the data.
    Has anyone faced this issue?
    Any ideas for fetching a huge amount of data with an Oracle report in Excel format?
    Thanks & Regards,
    KP.

    Hi Manikant,
    According to your description, performance is slow when displaying a huge amount of data with more than 3 measures in PowerPivot, and you want to know the hardware requirements for building a PowerPivot workbook that can display such data, right?
    PowerPivot benefits from multi-core processors, large memory and storage capacities, and a 64-bit operating system on the client computer.
    Based on my experience, large memory, multiple processors, and even solid-state drives benefit PowerPivot performance. Here is a blog about memory considerations for PowerPivot for Excel for your reference.
    http://sqlblog.com/blogs/marco_russo/archive/2010/01/26/memory-considerations-about-powerpivot-for-excel.aspx
    Besides, you can identify which query was taking the time by using tracing; please refer to the link below.
    http://blogs.msdn.com/b/jtarquino/archive/2013/12/27/troubleshooting-slow-queries-in-excel-powerpivot.aspx
    Regards,
    Charlie Liao
    TechNet Community Support

  • Error in Generating reports with large amount of data using OBIR

    Hi all,
    we have integrated OBIR (Oracle BI Reporting) with OIM (Oracle Identity Management) to generate custom reports. Some of the custom reports contain a large amount of data (approx. 80-90K rows with 7-8 columns), and the queries for these reports primarily use the audit tables and the resource form tables. When we generate a report as HTML directly on the console it works fine, but when we try to generate the same report and save it as PDF or Excel, it fails with the following error.
    [120509_133712190][][STATEMENT] Generating page [1314]
    [120509_133712193][][STATEMENT] Phase2 time used: 3ms
    [120509_133712193][][STATEMENT] Total time used: 41269ms for processing XSL-FO
    [120509_133712846][oracle.apps.xdo.common.font.FontFactory][STATEMENT] type1.Helvetica closed.
    [120509_133712846][oracle.apps.xdo.common.font.FontFactory][STATEMENT] type1.Times-Roman closed.
    [120509_133712848][][PROCEDURE] FO+Gen time used: 41924 msecs
    [120509_133712848][oracle.apps.xdo.template.FOProcessor][STATEMENT] clearInputs(Object) is called.
    [120509_133712850][oracle.apps.xdo.template.FOProcessor][STATEMENT] clearInputs(Object) done. All inputs are cleared.
    [120509_133712850][oracle.apps.xdo.template.FOProcessor][STATEMENT] End Memory: max=496MB, total=496MB, free=121MB
    [120509_133818606][][EXCEPTION] java.net.SocketException: Socket closed
    at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:99)
    at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
    at weblogic.servlet.internal.ChunkOutput.writeChunkTransfer(ChunkOutput.java:525)
    at weblogic.servlet.internal.ChunkOutput.writeChunks(ChunkOutput.java:504)
    at weblogic.servlet.internal.ChunkOutput.flush(ChunkOutput.java:382)
    at weblogic.servlet.internal.ChunkOutput.checkForFlush(ChunkOutput.java:469)
    at weblogic.servlet.internal.ChunkOutput.write(ChunkOutput.java:304)
    at weblogic.servlet.internal.ChunkOutputWrapper.write(ChunkOutputWrapper.java:139)
    at weblogic.servlet.internal.ServletOutputStreamImpl.write(ServletOutputStreamImpl.java:169)
    at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
    at oracle.apps.xdo.servlet.util.IOUtil.readWrite(IOUtil.java:47)
    at oracle.apps.xdo.servlet.CoreProcessor.process(CoreProcessor.java:280)
    at oracle.apps.xdo.servlet.CoreProcessor.generateDocument(CoreProcessor.java:82)
    at oracle.apps.xdo.servlet.ReportImpl.renderBodyHTTP(ReportImpl.java:562)
    at oracle.apps.xdo.servlet.ReportImpl.renderReportBodyHTTP(ReportImpl.java:265)
    at oracle.apps.xdo.servlet.XDOServlet.writeReport(XDOServlet.java:270)
    at oracle.apps.xdo.servlet.XDOServlet.writeReport(XDOServlet.java:250)
    at oracle.apps.xdo.servlet.XDOServlet.doGet(XDOServlet.java:178)
    at oracle.apps.xdo.servlet.XDOServlet.doPost(XDOServlet.java:201)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
    at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292)
    at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
    at oracle.apps.xdo.servlet.security.SecurityFilter.doFilter(SecurityFilter.java:97)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3496)
    at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
    at weblogic.security.service.SecurityManager.runAs(Unknown Source)
    at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2180)
    at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2086)
    at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1406)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    It seems we face this issue when the query processing takes some time. Do I need to perform any additional configuration to generate such reports?

    java.net.SocketException: Socket closed
         at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:99)
         at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
         at weblogic.servlet.internal.ChunkOutput.writeChunkTransfer(ChunkOutput.java:525)
         at weblogic.servlet.internal.ChunkOutput.writeChunks(ChunkOutput.java:504)
         at weblogic.servlet.internal.ChunkOutput.flush(ChunkOutput.java:382)
         at weblogic.servlet.internal.CharsetChunkOutput.flush(CharsetChunkOutput.java:249)
         at weblogic.servlet.internal.ChunkOutput.checkForFlush(ChunkOutput.java:469)
         at weblogic.servlet.internal.CharsetChunkOutput.implWrite(CharsetChunkOutput.java:396)
         at weblogic.servlet.internal.CharsetChunkOutput.write(CharsetChunkOutput.java:198)
         at weblogic.servlet.internal.ChunkOutputWrapper.write(ChunkOutputWrapper.java:139)
         at weblogic.servlet.internal.ServletOutputStreamImpl.write(ServletOutputStreamImpl.java:169)
         at com.tej.systemi.util.AroundData.copyStream(AroundData.java:311)
         at com.tej.systemi.client.servlet.servant.Newdownloadsingle.producePageData(Newdownloadsingle.java:108)
         at com.tej.systemi.client.servlet.servant.BaseViewController.serve(BaseViewController.java:542)
         at com.tej.systemi.client.servlet.FrontController.doRequest(FrontController.java:226)
         at com.tej.systemi.client.servlet.FrontController.doPost(FrontController.java:128)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
         at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
         at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:175)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3498)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(Unknown Source)
         at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2180)
         at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2086)
         at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1406)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:17
    (Please help us find a solution to this issue; it is in production and we need it ASAP.)
    Thanks in Advance
    Edited by: 909601 on Jan 23, 2012 2:05 AM

  • IPad downloading enormous amount of data

    We set my folks up with an iPad2 a few months ago, and it has been using extraordinary amounts of data. They live in the country and broadband and dsl are not available, so we have them set up on a cellular hotspot for internet access, which the iPad views as wi-fi. All was fine with a desktop and a netbook. Once we added an iPad to the mix, their data use has skyrocketed. Recent monitoring is showing approximately 1G per day. They are retired and are only doing facebook, email and an occasional view of a website for shopping. Have turned off auto-play of videos on facebook and shut down everything I can think of that might be using data. Yesterday's activity still showed just over 1G of data downloaded and about 100M uploaded.
    One note - my mother has Parkinson's, and is the primary user of the iPad. Due to the tremors from PD, she tends to tap the screen repeatedly while using. Is it possible that each tap is causing a refresh and downloading everything all over again? Would that result in the enormous download figures?
    I've also seen threads where there were issues with iPhones on AT&T using large amounts of data. Any chance there is a similar problem with an iPad on cellular hotspot? Provider is US Cellular, not AT&T, but makes me curious.
    Thanks for any thoughts or help you can provide.

    Poke around in the settings. Background App Refresh can chew data - that's where the iPad reaches out and constantly updates the weather and other apps so that you have fresh data at any point in time...which you don't really need. If you only check the weather once a day, you don't need it refreshed every 15 minutes.
    Not a way to find it, but maybe a way to control it: put the iPad in Airplane Mode unless you're actively using it. That turns off the antennas and keeps it from accessing data.

  • XY graphing with large amount of data

    I was looking into graphing a fairly substantial amount of data.  It arrives in bursts over serial.
    What I have is 30 values corresponding to remote data sensors.  The data for each comes across together, so I have no problem having the data grouped.  It is, effectively, in an array of size 30.  I've been wanting to place this data in a graph.
    The period varies between 1/5 second and 2 minutes between receptions (it's wireless and mobile, so signal strength varies).  This eliminates the waveform graph, as the time isn't constant and there's a lot of data (so no random NaN insertion).  This leaves me with an XY graph.
    Primary interest is in the last 10 minutes or so, with a desire to see up to an hour or two back.
    The data is fairly similar, and I'm trying to split it into groups of 4 or 5 sets of ordered pairs per graph.
    Problems:
    1.  If data comes in slowly enough, everything is OK, but the time needed to synchronously update the graph(s) can often exceed the time it takes to fully receive the chunk of data which contains these data points.  Updating asynchronously is useless, as the graphs need to be reasonably in tune with the most recent data received.  I can't have exponential growth in the gap between the time represented on the graph and the time the last bit of data was received.
    2.  I could use some advice on making older data points sparser so that older data can still be viewed, but with a sort of 'decay' of old data; I don't value that 1/5-second resolution at all for old data.
    I'm most concerned with solving problem 1, but random suggestions on 2 are most welcome.

    I didn't quite get the first question. Could you try to find out where exactly the time is consumed in your program, and then refine your question?
    To the second question (which may also solve the first one): you can store all the data to a file as it arrives. Keep the most recent data in, let's say, a shift register of the update loop. Add a data point corresponding to the start of the measurement to the data and wire it to an XY graph. Make the X scrollbar visible. Handle the event for the XY graph's X scale change: when the scale is changed, i.e. the user scrolls the scroll bar, load data from the file and display it on the XY graph. This is a little simplified; in reality it's a little more complicated.
    In my project, I am writing an XY graph XControl to handle a similar issue. I have huge data sets, i.e. multichannel recordings with millisecond resolution over many hours. One data set may be several gigabytes, so the data won't fit in the computer's main memory at once. I wrote an XControl which contains an XY graph. Each time the time scale is changed, i.e. the scroll bar is scrolled, the XControl generates a user event if it doesn't have enough data to display the requested time slice. The user event is handled outside the XControl by loading the appropriate set of data from disk and displaying it on the XControl. The user event can be generated a while before the XControl runs out of data; this way the data is loaded a bit in advance, which allows seamless scrolling on the XY graph. One must notice that front panel updates must be turned off in the XControl while the data is updated, and turned back on after the update has finished; otherwise the XY graph will flicker annoyingly.
    Tomi Maila

  • DSS problems when publishing large amount of data fast

    Has anyone experienced problems when sending large amounts of data using the DSS. I have approximately 130 to 150 items that I send through the DSS to communicate between different parts of my application.
    There are several loops publishing data. One publishes approximately 50 items at a rate of 50 ms, another about 40 items at a 100 ms publishing rate.
    I send a command to a subprogram (125 ms) that reads and publishes the answer on a DSS URL (approx. 125 ms), so that is one item on the DSS for about 250 ms. But this data is not seen on my main GUI window that reads the DSS URL.
    My questions are
    1. Is there any limit in speed (frequency) for data publishing in DSS?
    2. Can DSS be unstable if loaded to much?
    3. Can I lose/miss data in any situation?
    4. In the DSS Manager I have doubled the MaxItems and MaxConnections. How will this affect my system?
    5. When I run my full application I have experienced the following error: Fatal Internal Error: "memory.ccp", line 638. Can this be a result of my large application and the heavy load on DSS? (see attached picture)
    Regards
    Idriz Zogaj
    Idriz "Minnet" Zogaj, M.Sc. Engineering Physics
    Memory Profesional
    direct: +46 (0) - 734 32 00 10
    http://www.zogaj.se

    LuI wrote:
    >
    > Hi all,
    >
    > I am frustrated with VISA serial comm. It looks so neat and it's
    > fantastic what it's supposed to do for a developer, but sometimes one
    > runs into trouble very deep.
    > I have an app where I have to read large amounts of data streamed by
    > 13 µCs at 230 kBaud. (They do not necessarily need to stream all at the
    > same time.)
    > I use either a Moxa multiport adapter C320 with 16 serial ports or -
    > for test purposes - a Keyspan serial-to-USB adapter with 4 serial
    > ports.
    Does it work better if you use the serial port(s) on your motherboard?
    If so, then get a better serial adapter. If not, look more closely at
    VISA.
    Some programs have some issues on serial adapters but run fine on a
    regular serial port. We've had that problem recently.
    Best, Mark

  • Freeze when writing large amount of data to iPod through USB

    I used to take backups of my PowerBook to my 60G iPod video. Backups are taken with tar in Terminal directly to the mounted iPod volume.
    Now, every time I try to write a big amount of data to the iPod (from the MacBook Pro), the whole system freezes (the mouse cursor moves, but nothing else can be done). When the USB cable is pulled out, the system recovers and acts as it should. This problem happens every time a large amount of data is written to the iPod.
    The same iPod works perfectly (when backing up) with the PowerBook, and small amounts of data can easily be written to it (from the MacBook Pro) without problems.
    Does anyone else have the same problem? Any ideas why is this and how to resolve the issue?
    MacBook Pro, 2.0Ghz, 100GB 7200RPM, 1GB Ram   Mac OS X (10.4.5)   IPod Video 60G connected through USB

    Ex PC user...never had a problem.
    Got a MacBook Pro last week...having the same issues...and this is now with an exchanged machine!
    I've read elsewhere that it's something to do with the USB timing out, and that if you get a new USB port and attach it (one that's powered separately), it should work. Kind of a bummer, but those folks who tried it say it works.
    Me, I can upload to the iPod piecemeal, manually...but even then, it sometimes freezes.
    The good news is that once the iPod is loaded, the problem shouldn't happen. It's the large amounts of data.
    Apple should DEFINITELY fix this though. Unbelievable.
    MacBook Pro 2.0   Mac OS X (10.4.6)  

  • Getting Short dump due to large amount of data

    Hi experts,
    When we run the RALM_ME_MEASP_FULL_DOWNLOAD_SD program, we get a short dump every time due to the large amount of data.
    Please suggest how to run this program without a short dump.
    Thanks & Regards
    Prashant Gupta

    Hi,
    I guess you are running the wrong app. The service you are running is for MAM20. If you are interested in MAM30 or MAM25, please use the correct service.
    If you search for
    *FULL_DOWNLOAD_SD
    you will find them all. Use the ones without the RALM in front - then the timeout should be solved as well. Besides that, there is a great guide available that also helps to solve some issues around server-driven replication:
    [MAM SERVER DRIVEN SETUP GUIDE|http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/818ac119-0b01-0010-ba8b-b6e3f3490a63?quicklink=index&overridelayout=true]
    Hope that helps to solve your issue.
    Regards,
    Oliver

  • IPhone using extremely high amounts of data when not in use

    I have a 4S on AT&T and am grandfathered in w/ the unlimited data plan. Last month I got the notification that I was in the top 5% of data users (which sounds crazy to me considering what some other people I've read about seem to use). According to AT&T's site I had used 2.09 GB with 16 days left in my billing cycle.
    Once I got that notification I logged on to see if I could find out when I was using such large amounts of data. It turns out that between midnight and 1 am almost every night, large amounts of data were sent. I think this is strange for two reasons: 1) I'm on WiFi at home in my apartment, and 2) I wasn't even awake at the times these charges occurred.
    Example: January 6, 2012 at 12:40 am I was charged for 417,327kb, roughly 417MB.
    Similar data usage occurred almost every night going back to 12/28/11. On 12/29 I was charged for 468MB at 12:21am. Very unlikely this was actually data I used since I had work the next day and wasn't awake. It honestly looks like 80-85% of my data usage is coming from these occurrences. It also isn't likely that this is a total of data used throughout the day as there are other entries of smaller amounts spread out throughout the day.
    Now, if I was on a truly unlimited plan and there was no such thing as throttling, I really wouldn't care about this. But the fact is that my 3G speeds are being throttled (I just ran a speed test at a location where I used to get over 1 Mbps and I was at .07 Mbps). I have spoken to AT&T and they insisted it was an issue w/ my phone, either the hardware or the OS. So I went to Apple and the "Genius" did a DFU restore for me in store and told me that would fix it. It hasn't, and now I'm stuck with this constantly happening and unbearably slow 3G data speeds for the rest of the month.
    That was my last billing cycle where I was throttled down to download speeds of .07mbps. Unusable. This month I decided to closely monitor my data usage. I turned off iCloud and photostream, and turn off data whenever I'm using wifi. I turned off the sending info to apple setting.
    I checked my account on ATT.com today and noticed that yesterday morning at 7:58 am, while driving to work, I somehow used 247 MB. Don't ask me how; I'm not streaming anything, and my phone is either locked or playing music from the iPod app. I could probably stream Netflix for 5 hours and not use that much data.
    I called AT&T to let them know (AGAIN) and got bumped up to a "Data manager" who was incredibly b*tchy and rude. She said there must be some app that's using that much data to update. I don't have any apps that are even that size, so I don't see how this could possibly be the case. I tried to talk with her about how that can't really be the case, and she just kept repeating that there must be some app using the data, blah blah. I went to Apple with this last month after failing with AT&T; at first they did a DFU restore and we restored the phone as new, then it kept happening, so I went back and they replaced my phone. AT&T said it must be a hardware or software issue last month, but this month it must be an app that's doing this. I can't get a straight answer from them and they just keep passing the ball, saying it's Apple's problem.
    I really don't know what to do. I'm grandfathered in on the unlimited plan and don't really want to change that. Say I change my plan to 3 GB at the same $30/month. What's to say this won't keep happening, and I'll be over that 3 GB in 10 days and then get charged an extra $10 for each additional GB?!
    I can't think of any apps I've downloaded in the past few months that would result in this change and I can't live with another 2/3 of my billing cycle being throttled down to unusable speeds.
    Has anybody else had anything similar happen? I have no idea what to do next (and sorry for the long rant but I'm fresh off the phone with AT&T and I'm ******)

    Thanks for the link. I only went through the last three pages and unfortunately it looks like there is no solution. I turn data off every night and turn it on only when I'm not in a wifi area. It seems that as soon as I turn it on, within an hour it charges me data for a backlog of whatever it didn't do when it was connected to wifi.
    I think it's absolutely disgusting that AT&T can just dismiss this and act like it's the user's fault. Apps and email updating in the middle of the night to the tune of 400 MB? I highly doubt it, and then when I called to ask them about it they tried to make me sound stupid, like I have an app or two open. It just really ticked me off. The worst part about it is that there doesn't appear to be anything I can do about it aside from never using my data.

  • Issue with time sheet data source

    Hi All,
      We are on ECC 5.0.
    I am trying to load CATS timesheet data into BW through the data source 0CA_TS_1 (approved time). This data source fetches data from the CATSDB table.
    In CATSDB there is a good amount of data with Status '30' i.e. 'Approved'.
    However on running RSA3 checker this datasource is returning 0 records.
    The most relevant SAP Note was 509592, but it is valid only for older versions.
    The ECC HR is on Level 10 and support pack SAPKE50010.
    This issue has been unresolved in previous threads.
    Please suggest a solution. It's urgent. Points for sure.
    Thanks
    Vishno

    Hi Guillaume 
    Thanks for the reply
    I had a look at the above link.
    1) The above link says we need to perform the steps below. Where do we need to run these steps (SQL Server or AD server)?
    Sorry for asking a basic question.
    Click Start, click Run, type Adsiedit.msc, and then click OK.
    In the ADSI Edit snap-in, expand Domain [DomainName], expand DC=RootDomainName, expand CN=Users, right-click CN=AccountName, and then click Properties.
    In the CN=AccountName Properties dialog box, click the Security tab.
    On the Security tab, click Advanced.
    In the Advanced Security Settings dialog box, make sure that SELF is listed under Permission entries.
    2) The link talks about Windows Server 2000. Is this applicable to 2008?
    Regards
    Santosh

  • Couldn't copy large amount of data from enterprise DB to Oracle 10g

    Hi,
    I am using iBATIS to copy data from EnterpriseDB (EDB) to Oracle and vice versa.
    The datatype of the field on EDB is 'text' and the datatype on Oracle is 'SYS.XMLTYPE'.
    I am binding these to a Java String property in a POJO to bind values.
    I can successfully copy a limited amount of data from EDB to Oracle, but if there is more data I get the following exceptions with different Oracle drivers (although I can read large amounts of data from EDB):
    --- Cause: java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
    at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
    at com.ibatis.sqlmap.engine.impl.SqlMapExecutorDelegate.update(SqlMapExecutorDelegate.java:457)
    at com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl.update(SqlMapSessionImpl.java:90)
    at com.ibatis.sqlmap.engine.impl.SqlMapClientImpl.update(SqlMapClientImpl.java:66)
    at com.aqa.pojos.OstBtlData.updateOracleFromEdbBtlWebservice(OstBtlData.java:282)
    at com.aqa.pojos.OstBtlData.searchEdbAndUpdateOracleBtlWebservice(OstBtlData.java:258)
    com.ibatis.common.jdbc.exception.NestedSQLException:
    --- The error occurred in com/aqa/sqlmaps/SQLMaps_OSTBTL_Oracle.xml.
    --- The error occurred while applying a parameter map.
    --- Check the updateOracleFromEDB-InlineParameterMap.
    --- Check the parameter mapping for the 'btlxml' property.
    --- Cause: java.sql.SQLException: setString can only process strings of less than 32766 chararacters
    at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
    at com.iba
    I have the latest Oracle 10g JDBC drivers.
    Remember, I can copy any amount of data from Oracle to EDB, but not the other way around.
    Please let me know if you have come across this issue; any recommendation is very much appreciated.
    Thanks,
    CK.

    Hi,
    I finally remembered how I solved this issue previously.
    The JDBC driver isn't able to directly call the insert with an XMLType column. The solution I was using was to build a wrapper procedure in PL/SQL.
    Here it is (for insert, but I suppose that update will be the same):
    create or replace procedure insertXML(file_no_in in number, program_no_in in varchar2, ost_XML_in in clob, btl_XML_in in clob) is
    begin
    insert into AQAOST_FILES (file_no,program_no,ost_xml,btl_xml) values(file_no_in, program_no_in, xmltype(ost_XML_in), xmltype(btl_XML_in));
    end insertXML;
    here is the sqlmap file I used
    <?xml version="1.0" encoding="UTF-8" ?>
    <!DOCTYPE sqlMap
    PUBLIC "-//ibatis.apache.org//DTD SQL Map 2.0//EN"
    "http://ibatis.apache.org/dtd/sql-map-2.dtd">
    <sqlMap>
         <typeAlias alias="AqAost" type="com.sg2net.jdbc.AqAost" />
         <insert id="insert" parameterClass="AqAost">
              begin
                   insertxml(#fileNo#,#programNo#,#ostXML:CLOB#,#bltXML:CLOB#);
              end;
         </insert>
    </sqlMap>
    and here is a simple program:
    package com.sg2net.jdbc;
    import java.io.IOException;
    import java.io.Reader;
    import java.io.StringWriter;
    import java.sql.Connection;
    import oracle.jdbc.pool.OracleDataSource;
    import com.ibatis.common.resources.Resources;
    import com.ibatis.sqlmap.client.SqlMapClient;
    import com.ibatis.sqlmap.client.SqlMapClientBuilder;
    public class TestInsertXMLType {
         /** Loads the sqlmap config, binds the two XML documents as CLOBs and runs the insert. */
         public static void main(String[] args) throws Exception {
              String resource = "sql-map-config-xmlt.xml";
              Reader reader = Resources.getResourceAsReader(resource);
              SqlMapClient sqlMap = SqlMapClientBuilder.buildSqlMapClient(reader);
              OracleDataSource dataSource = new OracleDataSource();
              dataSource.setUser("test");
              dataSource.setPassword("test");
              dataSource.setURL("jdbc:oracle:thin:@localhost:1521:orcl");
              Connection connection = dataSource.getConnection();
              sqlMap.setUserConnection(connection);
              AqAost aqAost = new AqAost();
              aqAost.setFileNo(3);
              aqAost.setProgramNo("prg");
              Reader ostXMLReader = Resources.getResourceAsReader("ostXML.xml");
              Reader bltXMLReader = Resources.getResourceAsReader("bstXML.xml");
              aqAost.setOstXML(readerToString(ostXMLReader));
              aqAost.setBltXML(readerToString(bltXMLReader));
              sqlMap.insert("insert", aqAost);
              connection.commit();
         }
         public static String readerToString(Reader reader) {
              StringWriter writer = new StringWriter();
              char[] buffer = new char[2048];
              int charsRead = 0;
              try {
                   while ((charsRead = reader.read(buffer)) > 0) {
                        writer.write(buffer, 0, charsRead);
                   }
              } catch (IOException ioe) {
                   throw new RuntimeException("error while converting reader to String", ioe);
              }
              return writer.toString();
         }
    }
    package com.sg2net.jdbc;
    public class AqAost {
         private long fileNo;
         private String programNo;
         private String ostXML;
         private String bltXML;
         public long getFileNo() {
              return fileNo;
         }
         public void setFileNo(long fileNo) {
              this.fileNo = fileNo;
         }
         public String getProgramNo() {
              return programNo;
         }
         public void setProgramNo(String programNo) {
              this.programNo = programNo;
         }
         public String getOstXML() {
              return ostXML;
         }
         public void setOstXML(String ostXML) {
              this.ostXML = ostXML;
         }
         public String getBltXML() {
              return bltXML;
         }
         public void setBltXML(String bltXML) {
              this.bltXML = bltXML;
         }
    }
    I tested the insert and it works correctly
    ciao,
    Giovanni
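    One quick way to rule JDBC out entirely is to smoke-test the wrapper from SQL*Plus first. A minimal sketch against the insertXML procedure above (the literal values are just placeholders):
    -- sanity-check the wrapper outside JDBC; values are placeholders
    begin
      insertXML(3, 'prg', '<ost/>', '<btl/>');
      commit;
    end;
    /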

  • XML-Export Error for large amount of data

    Hi there...
    I have an application process which runs on demand (button) and which exports data (from a SQL query) into a file (.xls).
    The result is formatted, and the export works fine as long as the query returns just a small amount of data, approx. 8 to 10 rows.
    As the result is stored in a CLOB, I output the data with "htp.prn" in a loop by "cutting" the CLOB into small pieces (varchar).
    However, as soon as the amount is bigger than the mentioned 8 to 10 rows, I get an error (sqlerrm: ORA-06502: PL/SQL: numeric or value error).
    I guess there must be something wrong with my loop or the way I "cut" the CLOB into pieces and output them.
    Maybe someone has a hint for me where to look exactly?
    Thanks in advance...
    Johnny
    Here is my code (I removed parts of it, which are not important for this issue):
    declare
    l_xml_header varchar2(32767);
    l_xml_body clob;
    l_xml_text varchar2(32767);
    l_xml_footer varchar2(32767);
    runner number;
    clob_size number;
    begin
    runner := 2;
    owa_util.mime_header( 'application/octet', FALSE);
    htp.p('Content-Disposition: attachment; filename="Test.xls"');
    owa_util.http_header_close;
    l_xml_header := '<?xml version="1.0" encoding="utf-8"?>'||chr(10)||
    '<?mso-application progid="Excel.Sheet"?>'||chr(10)||
    '<Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet"'||chr(10)||
    'xmlns:o="urn:schemas-microsoft-com:office:office"'||chr(10)||
    'xmlns:x="urn:schemas-microsoft-com:office:excel"'||chr(10)||
    'xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet"'||chr(10)||
    'xmlns:html="http://www.w3.org/TR/REC-html40">'||chr(10)||
    '<DocumentProperties xmlns="urn:schemas-microsoft-com:office:office">'||chr(10)||
    '<Version>1.0</Version>'||chr(10)||
    '</DocumentProperties>'||chr(10)||
    '<ExcelWorkbook xmlns="urn:schemas-microsoft-com:office:excel">'||chr(10)||
    '<WindowHeight>8580</WindowHeight>'||chr(10)||
    '<WindowWidth>15180</WindowWidth>'||chr(10)||
    '<WindowTopX>120</WindowTopX>'||chr(10)||
    '<WindowTopY>45</WindowTopY>'||chr(10)||
    '<ProtectStructure>False</ProtectStructure>'||chr(10)||
    '<ProtectWindows>False</ProtectWindows>'||chr(10)||
    '</ExcelWorkbook>'||chr(10)||
    '<Styles>'||chr(10)||
    '<Style ss:ID="Default" ss:Name="Normal">'||chr(10)||
    '<Alignment ss:Vertical="Bottom"/>'||chr(10)||
    '<Borders/>'||chr(10)||
    '<Font ss:FontName="Arial" x:Family="Swiss"/>'||chr(10)||
    '<Interior/>'||chr(10)||
    '<NumberFormat/>'||chr(10)||
    '<Protection/>'||chr(10)||
    '</Style>'||chr(10)||
    '<Style ss:ID="s22">'||chr(10)||
    '<Font x:Family="Swiss" ss:Bold="1"/>'||chr(10)||
    '</Style>'||chr(10)||
    '<Style ss:ID="s67">'||chr(10)||
    '<Font ss:FontName="Arial" x:Family="Swiss" ss:Color="#FFFFFF"/>'||chr(10)||
    '</Style>'||chr(10)||
    '<Style ss:ID="s157">'||chr(10)||
    '<Borders/>'||chr(10)||
    '</Style>'||chr(10)||
    '<Style ss:ID="s158">'||chr(10)||
    '<Borders>'||chr(10)||
    '<Border ss:Position="Right" ss:LineStyle="Continuous" ss:Weight="1"/>'||chr(10)||
    '</Borders>'||chr(10)||
    '</Style>'||chr(10)||
    '</Styles>';
    for z in 1..1
    loop
    l_xml_body:=l_xml_body||'<Worksheet ss:Name="Worksheet1"> <Table x:FullColumns="1" x:FullRows="1" ss:DefaultColumnWidth="60">';
    l_xml_body:=l_xml_body||'<Row><Cell ss:StyleID="s163"><Data ss:Type="String">Colum1</Data></Cell>'||
    '<Cell ss:StyleID="s163"><Data ss:Type="String">Colum2</Data></Cell>'||
    '<Cell ss:StyleID="s163"><Data ss:Type="String">Colum3</Data></Cell>'||
    '<Cell ss:StyleID="s163"><Data ss:Type="String">...</Data></Cell>'||
    '<Cell ss:StyleID="s166"><Data ss:Type="String">ColumN</Data></Cell></Row>';
    for z in (
    select
    a."Col1",
    a."Col2",
    b."Col3",
    b."ColN"
    from table1 a,
    table2 b
    where a.id = b.id
    loop
    l_xml_body := l_xml_body||'<Row><Cell ss:StyleID="s157"><Data ss:Type="String">'||
    z.Col1||'</Data></Cell><Cell ss:StyleID="s157"><Data ss:Type="String">'||
    z.Col2||'</Data></Cell><Cell ss:StyleID="s157"><Data ss:Type="String">'||
    z.Col3||'</Data></Cell><Cell ss:StyleID="s157"><Data ss:Type="String">'||
    ... ||'</Data></Cell><Cell ss:StyleID="s157"><Data ss:Type="String">'||
    z.ColN||'</Data></Cell>';
    l_xml_body := l_xml_body||'</Row>'||chr(10);
    runner := runner + 1;
    end loop;
    l_xml_body := l_xml_body||'</Table>';
    end loop;
    clob_size := dbms_lob.getlength(l_xml_body);
    htp.prn(l_xml_header);
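    -- Note: v_count is used in the loop below but is never declared or
    -- initialized in this excerpt; with a NULL offset, dbms_lob.substr
    -- returns NULL and nothing is written. Also, a 32767-character chunk
    -- can exceed 32767 bytes in a multi-byte character set, which raises
    -- ORA-06502 on the assignment to l_xml_text.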
    for i in 1..ceil(clob_size / 32767)
    loop
    l_xml_text := dbms_lob.SUBSTR (l_xml_body, 32767, v_count);
    HTP.prn (l_xml_text);
    v_count := v_count + 32767;
    end loop;
    htp.prn('</Worksheet></Workbook>');
    HTMLDB_APPLICATION.g_unrecoverable_error := TRUE;
    EXCEPTION
    WHEN OTHERS
    THEN
    OWA_UTIL.mime_header ('application/octet', FALSE);
    HTP.prn ('Content-Disposition: attachment; filename="Test.xls"');
    OWA_UTIL.http_header_close;
    HTMLDB_APPLICATION.g_unrecoverable_error := TRUE;
    end;
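    For comparison, a chunking loop that avoids the two pitfalls flagged in the comments above (offset explicitly initialized to 1, chunk size safely below the 32767-byte limit) might look like the following sketch; it assumes l_xml_body as declared above and is untested here:
    -- replacement for the final output loop; l_offset is a pls_integer initialized to 1
    while l_offset <= dbms_lob.getlength(l_xml_body) loop
      -- dbms_lob.substr takes (lob, amount, offset); 8000 characters stay
      -- under 32767 bytes even with multi-byte data
      htp.prn(dbms_lob.substr(l_xml_body, 8000, l_offset));
      l_offset := l_offset + 8000;
    end loop;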

  • Transporting large amounts of data from one database schema to another

    Hi,
    We need to move a large amount of data from one 10.2.0.4 database schema to another 11.2.0.3 database.
    I am currently using Data Pump, but it is still quite slow - having to do it in chunks.
    Also, the Data Pump files are quite large, so we have to compress them and move them across the network.
    Is there a better/quicker way?
    I have heard about transportable tablespaces but have never used them and don't know about their speed - whether they are quicker than Data Pump.
    The tablespace names are different in both databases.
    Also, the source database is on the Solaris operating system on a Sun box,
    and the target database is on AIX on an IBM Power Series box.
    Any ideas would be great.
    Thanks
    Edited by: user5716448 on 08-Sep-2012 03:30
    Edited by: user5716448 on 08-Sep-2012 03:31

    user5716448 wrote:
    Hi,
    We need to move a large amount of data from one 10.2.0.4 database schema to another 11.2.0.3 database.
    Please quantify "large".
    I am currently using Data Pump, but it is still quite slow - having to do it in chunks.
    Please quantify "quite slow".
    Also, the Data Pump files are quite large, so we have to compress them and move them across the network.
    Again, please quantify "quite large".
    Is there a better/quicker way?
    I have heard about transportable tablespaces but have never used them and don't know about their speed - whether they are quicker than Data Pump.
    The tablespace names are different in both databases.
    Also, the source database is on the Solaris operating system on a Sun box, and the target database is on AIX on an IBM Power Series box.
    It may be possible, assuming you do not violate any of these conditions:
    http://docs.oracle.com/cd/E11882_01/server.112/e25494/tspaces013.htm#ADMIN11396
    Any ideas would be great.
    Thanks
    Edited by: user5716448 on 08-Sep-2012 03:30
    Edited by: user5716448 on 08-Sep-2012 03:31
    Master Note for Transportable Tablespaces (TTS) -- Common Questions and Issues [ID 1166564.1]
    HTH
    Srini
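    If you do try transportable tablespaces, the self-containment check is the first gate, and impdp's REMAP_TABLESPACE parameter covers the differing tablespace names; endianness between the Solaris source and AIX target can be checked against V$TRANSPORTABLE_PLATFORM. A minimal sketch of the check (tablespace names are placeholders):
    -- verify the tablespace set is self-contained before transporting
    begin
      dbms_tts.transport_set_check('APP_DATA,APP_INDEX', incl_constraints => true);
    end;
    /
    -- any rows returned here mean the set cannot be transported as-is
    select * from transport_set_violations;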
