Problem with "huge" data

Can Oracle Spatial work with huge data? I am now trying to do an overlay on two geometry columns, one of which has more than twenty thousand records. I have validated the data, but when I do the overlay with SDO_GEOM.SDO_INTERSECTION() the system hangs without any response. Why?
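
A likely cause is not data volume but the query shape: computing SDO_GEOM.SDO_INTERSECTION() over the full cross join of both tables evaluates the function for every pair of rows. A minimal sketch follows (the table names TABLE_A/TABLE_B, the ID and GEOM columns, and the 0.005 tolerance are all assumptions); it lets the spatial index discard non-interacting pairs before the expensive intersection runs:

-- Assumes both geometry columns have spatial indexes; SDO_ANYINTERACT
-- uses the index to prune pairs that cannot interact, so
-- SDO_GEOM.SDO_INTERSECTION only runs on genuine candidates.
SELECT a.id,
       b.id,
       SDO_GEOM.SDO_INTERSECTION(a.geom, b.geom, 0.005) AS overlap_geom
FROM   table_a a,
       table_b b
WHERE  SDO_ANYINTERACT(b.geom, a.geom) = 'TRUE';

Without such a filter, 20,000+ rows joined against another geometry table can easily run long enough to look like a hang.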

I've found a new solution: use WebDocument for migration.

Similar Messages

  • Problems about Hold Data

    Hi All,
I'm confused by the usage of the "Hold Data" menu. I ran the t-code SE16 to check the table T002. In the selection screen I input "DE" in the SPRS field and then clicked on the menu System -> User Profile -> Hold Data. The system indicated "Data was held". But the next time I entered the selection screen of table T002 in the same session, the SPRS field was still empty. Does anybody know the reason?
    Thanks + Best Regards
    Jerome

    Hi Jerome,
If Hold Data is not activated in the screen attributes, menu entries like Hold Data and Set Data will have no effect.
Have a look at the link below.
    http://help.sap.com/saphelp_nw2004s/helpdata/en/c1/07c05878a711d2958a0000e8353423/content.htm
    I hope it helps.
    Best Regards,
    Vibha

  • A problem about filter data in RPD by Roles

    Dear All,
    I face a serious problem.
How do I set the filter in the RPD if there are several members I don't want to receive in different generations?
Assume I want to receive A, B in Gen4, Accounts; C in Gen5, Accounts; and D, E, D in Gen6, Accounts.
I tried to set it up a few ways, but the result didn't come out right: some Gen4, Accounts members that should show didn't.
    Best Regards,
    Martin

    Hi Srihari,
    Thanks for your reply.
I have also tried using the object model to display the report.
Here the field catalog does not come into play.
Only the following lines are involved for the ALV, yet the issue occurs in the same way:
" Create the ALV object for the internal table IG_ALV
cl_salv_table=>factory( IMPORTING r_salv_table = gr_table
                        CHANGING  t_table      = IG_ALV ).
" Enable all standard ALV toolbar functions
gr_functions = gr_table->get_functions( ).
gr_functions->set_all( abap_true ).
" Display settings: striped (zebra) pattern and a list header
gr_display = gr_table->get_display_settings( ).
gr_display->set_striped_pattern( cl_salv_display_settings=>true ).
gr_display->set_list_header( 'EDI IDOC' ).
" The column object is fetched but not further customised here
gr_columns = gr_table->get_columns( ).
gr_table->display( ).

  • Help! Problem with JSTL SQL "LIKE ?" in a query

    Hi All,
I have a problem getting data using "LIKE" in my SQL statement.
    here is my case:
    <sql:query var="tmp">
    SELECT ...... FROM ...
    WHERE a LIKE ?
    <sql:param value="%${param.a}%"/>
    </sql:query>
Once I use the "LIKE" keyword, the query fails to apply this criterion
and can't find any matching rows.
Thanks for any help,
Micheal

Besides, I found that this works:
    "AND a.block LIKE '%' + 'a' + '%'"
    but these don't work:
    "AND a.block LIKE '%' + 'a' + '' + '%'"
    or
    "AND a.block LIKE '' + '%' + 'cp' + '%'"
    or
    "AND a.block LIKE '%' + 'cp' + '%' + ''"
It seems '' is the cause of the error... so strange. Does anyone have an idea?
Micheal
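
A note on the failing variants (a sketch; the table and column names t and a are placeholders): '+' string concatenation is database-specific, and some databases treat the empty string '' specially, which can turn the whole pattern into one that never matches. Binding the complete pattern as a single parameter avoids SQL-side concatenation entirely:

-- Build the whole pattern in code and bind it as one value instead of
-- concatenating '%' pieces in SQL; concatenation syntax ('+' vs '||'
-- vs CONCAT) and empty-string semantics differ between databases.
SELECT *
FROM   t
WHERE  a LIKE ?;  -- bind value: "%" + param.a + "%", built in the JSP/EL layer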

  • Problem About Date Format in Dashboard

    Hi All:
I still have a problem with the date format in the dashboard and dashboard prompt. When I use cast(dealdate as date) in the dashboard prompt (column formula of deal_date) and in the filter view (in the edit view of the filter column formula), it converts the date format from mm/dd/yyyy to yyyy-mm-dd, but I want the format dd-mm-yyyy.
And one more problem: when I use the cast function in the filter column formula, how do I give it an alias? Because it shows as cast(deal_date as date) in the dashboard filter view.
Please help me solve this problem.
    Haroon

Changing the INI files should change the date format. Have you amended the correct instances of the date format? In DBFeatures.INI there is one for every ODBC connection.
For example, in my DBFeatures.INI:
[ DATA_SOURCE_FEATURE = ODBC_300 ]
DATE_FORMAT = 'dd-mm-yyyy' ;
TIME_FORMAT = 'hh:mi:ss' ;
DATE_TIME_FORMAT = 'dd-mm-yyyy hh:mi:ss' ;
IDENTIFIER_QUOTE_CHAR = '"' ;
There is also a file on the presentation side, in OracleBI\web\config, called localedefinitions.xml. Inside it there appear to be defaults for localisations, so it may be possible to amend your locale in the Administration settings of the dashboard.
However, I haven't tried this, sorry.
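
If the underlying database is Oracle, an explicit format mask in the column formula sidesteps the CAST default entirely. A sketch only (the deals table and deal_date column are assumptions, and note the result becomes a character value, not a DATE):

-- TO_CHAR returns a string in exactly the requested layout, so the
-- displayed format no longer depends on CAST or locale defaults.
SELECT TO_CHAR(deal_date, 'DD-MM-YYYY') AS deal_date_fmt
FROM   deals;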

  • Problem with grant assignment in Student Master Data

I have a problem with grant assignment in Student Master Data.
I want to change grant details in Student Master Data. When I choose the disbursement type Pay Fixed Amount
and then run the Grant Evaluation Run, it produces a result.
But when I choose another disbursement type, such as:
1. Pay Up to a Certain Percentage
2. Pay a Certain Percentage Up to a Certain Amount
3. Pay Up to a Certain Amount
and run the Grant Evaluation, it runs but produces no result.
I need help; I don't know why I can't change it, or whether I am missing something.

    hello experts,
Our problem is still not fixed, but we have contacted SAP to have a look at it. In the meantime we have created a new way to calculate the grants by doing no check at all, because our client doesn't want to have prerequisites yet.
Therefore we made a custom function module from PMIQBP_GR_COND_EVALT_01. If the boolean ev_xok is true, the system will calculate and post the grant amount.
Good luck, and I'll keep you posted.
    Kind regards,
    Steve de Klonia

  • Huge data usage with Verizon 4G LTE hotspot (Verizon-890L-50DB)

I recently upgraded my internet service to a Verizon 4G LTE hotspot (Verizon-890L-50DB). First month: no problems. Second month: I have 6GB of overage charges. Every time I log in to check email or do any other minimal-data task, I get hit with another GB or more of download usage. My computer is an HP Pavilion laptop running Windows 7 with NETWORX installed. Networx tells me it is indeed MY computer that is downloading this data, and my hard drive also shows the increases in data.
    I've run AVG, Glary Utilities, Speedy PC PRO and Spybot Search and Destroy but found no virus, rootkit or anything on my computer that would explain this problem. 
    When I use a different wireless internet service, I do NOT see the extraneous data usage.
    I contacted Verizon's support team and got no help diagnosing the problem but they did agree to send a replacement for my hotspot device.  Hopefully this fixes the issue; then I have to battle with them about the unwarranted $60 or more in overage charges.
    Has anyone else experienced a problem similar to this?

I would recommend getting out of the contract before the 14-day trial period ends. Verizon will charge you an activation fee, a restocking fee, and one month's service, but that is better than being stuck with this mess. I have HomeFusion and am afraid to use it, since Verizon seems to fabricate data usage. Unfortunately I did not realize this until after the 14-day trial. Now it will cost me $350 to terminate my contract.
Quoted from Lyda1 in Verizon Jetpack 4G LTE Mobile Hotspot 890L:
Exactly the same thing has happened to me. I purchased the Verizon Jetpack™ 4G LTE Mobile Hotspot 890L two days ago. After one day, I had supposedly used half the 5GB monthly allowance. After two days, I am at 4.25 GB usage. I don't stream movies, I have the hotspot password protected, I live alone, and no one else uses my computer. I have not downloaded any large files. At this rate, I'll go broke soon.

  • Transferring huge data via Java sockets is problematic!

    Hello!
I tried to write a server app in Java to receive huge data, consisting of XML metadata with file information followed by the file's binary data...
I read from the socket with a DataInputStream like this:
byte[] res = new byte[4096];  // read buffer (size assumed)
int got;
ByteArrayOutputStream out = new ByteArrayOutputStream();
while ((got = in.read(res)) != -1)
    out.write(res, 0, got);
byte[] received = out.toByteArray();
I saw that if I store all the received data in an array it takes a lot of memory, and I cannot even transfer files of 650MB.
So I decided to write what I get from the socket directly to a file...
I used a FileOutputStream to do it...
But the problem is: THIS SERVER IS SO SLOW!
I wrote similar code in VB6, and it receives data from the client much, much faster than Java...
What's wrong with my code that makes it so slow?
How can I solve it?
Do you experts have any server code example for receiving huge amounts of data?
Please advise...
Thanks a lot

How to use buffered streams: go to the Java tutorials http://java.sun.com/docs/books/tutorial/index.html and look for "basic input/output", then "buffered streams" under it. You'll want BufferedInputStream and BufferedOutputStream.
Alternatively:
        byte[] buf = new byte[8192];
        while (true) {
            int count = in.read(buf);
            if (count == -1)
                break;
            out.write(buf, 0, count);
        }
If at all possible, don't read the whole file into memory at once. Read with a loop like the above and do whatever you are supposed to do with the data a chunk at a time. But if you must have it in memory, then I guess you've got to do what you've got to do.

  • Migration of huge data from norm tables to denorm tables for performance

We are planning to move the NORM tables to DENORM tables in an Oracle DB for a client, for performance reasons. Any ideas on the design/approach we can use to migrate this HUGE data set (2 billion records / 5TB of data) in a window of 5 to 10 hours (or less)?
We have developed SQL that is one single query containing multiple instances of the same table and lots of joins. Will that be helpful?

Jonathan Lewis wrote, quoting Lother's question above:
Unfortunately, the fact that you have to ask these questions of the forum tells us that you don't have the skill to determine whether or not the exercise is needed at all. How have you proved that denormalisation is necessary (or even sufficient) to solve the client's performance problems if you have no idea how to develop a mechanism to restructure the data efficiently?
Regards,
Jonathan Lewis

Your brutal honesty is certainly correct. Another thing that concerns me is that he may be planning to denormalize tables that are normalized for a reason. What good is a system that responds like a data warehouse but has questionable data integrity? I didn't even know where to begin with asking that question, though.
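
For the bulk move itself, the usual technique at this scale (a sketch; the table names, columns, and degree of parallelism are all assumptions) is a direct-path, parallel CREATE TABLE AS SELECT rather than row-by-row inserts:

-- Direct-path parallel CTAS minimises redo/undo and is the standard
-- way to build a denormalised copy inside a tight window; indexes
-- and constraints are built afterwards, also in parallel.
ALTER SESSION ENABLE PARALLEL DDL;

CREATE TABLE orders_denorm
  PARALLEL 8
  NOLOGGING
AS
SELECT /*+ PARALLEL(o 8) PARALLEL(c 8) */
       o.order_id, o.order_date, o.amount,
       c.customer_id, c.customer_name, c.region
FROM   orders o
JOIN   customers c ON c.customer_id = o.customer_id;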

  • Can you help me with Change Data Capture in 10.2.0.3?

    Hi,
I did some research on Change Data Capture and I am trying to implement it between two databases for two small tables in 10g Release 2. My CDC implementation uses archive logs to replicate data.
The Change Data Capture mode is asynchronous autolog archive mode. It works correctly (except for DDL). Now I have some questions about a CDC implementation for large tables.
I have one scenario to implement, but I cannot find exactly how to do it correctly.
I have one table (named test) that consists of 100,000,000 rows; every day 1,000,000 transactions occur on this table, and I manually archive the data older than one year. This table is in the source DB. I want to replicate this table to another stage database using Change Data Capture.
My questions about this scenario:
1. How can I do the initial load? (The test table has 100,000,000 rows in the source DB.)
2. CDC uses a change table (named test_ch) that contains extra rows describing the operations for the stage table. But I need the original table (named test) for the application to work in the stage database. How can I move the data from the change table (test_ch) to the original table (test) in the stage database? (I'd prefer not to use a view for the test table.)
3. How can I remove some data from the change table (test_ch) in the stage DB? Will that cause problems or not?
4. Is there a way to replicate DDL operations between the two databases?
5. How can I find the last applied log on the stage DB in CDC? How can I find an archive gap between the source DB and the stage DB?
6. How do I do the maintenance of the change tables in the stage DB?
Asynchronous CDC uses Streams to generate the change records. Basically, it is a pre-packaged DML handler that converts the changes into inserts into the change table. You indicated that you want the changes to be written to the original table, which is the default behavior of Streams replication. That is why I recommended that you use Streams directly.
Yes, it is possible to capture changes from a production redo/archive log at another database. This capability is called "downstream" capture in the Streams manuals. You can configure it using the MAINTAIN_* procedures in the DBMS_STREAMS_ADM package (where * is one of TABLES, SCHEMAS, or GLOBAL depending on the granularity of change capture).
A couple of tips for using these procedures for downstream capture:
1) Don't forget to set up log shipping to the downstream capture database. Log shipping is set up exactly the same way for Streams as for Data Guard. Instructions can be found in the Streams Replication Administrator's Guide. This configuration has probably already been done as part of your initial CDC setup.
2) Run the command at the database that will perform the downstream capture. This database can also be the destination (or target) database where the changes are to be applied.
3) Explicitly define the parameters capture_queue_name and apply_queue_name to be the same queue name. Example:
capture_queue_name => 'STRMADMIN.STREAMS_QUEUE'
apply_queue_name   => 'STRMADMIN.STREAMS_QUEUE'

  • Problem with Master Data Load

    Dear Experts,
If somebody can help me with the following case, please give me a solution. I'm working on a BI 7.0 project where I needed to delete master data for the InfoObject material. The way I did this was through tcode "S14". After that, I tried to load the master data again, but the process broke and only half of the data was loaded.
This is the error:
    Second attempt to write record 'YY99993' to /BIC/PYYYY00006 failed
    Message no. RSDMD218
    Diagnosis
    During the master data update, the master data tables are read to determine which records of the data package that was passed have to be inserted, updated, or modified. Some records are inserted in the master data table by a concurrently running request between reading the tables at the start of package processing and the actual record insertion at the end of package processing.
    The master data update tries to overwrite the records inserted by the concurrently running process, but the database record modification returns an unexpected error.
    Procedure
• Check if the values of the master data record with the key specified in this message are updated correctly.
• Run the RSRV master data test "Time Overlaps of Load Requests" and enter the current request to analyze which requests are running concurrently and may have affected the master data update process.
• Re-schedule the master data load process to avoid such situations in future.
• Read SAP note 668466 to get more information about master data update scheduling.
On the other hand, the SID table in the master data product is empty.
Thanks for your help!
    Luis

    Dear Daya,
Thanks for your help; I applied your suggestion. I sent the following details to OSS:
We are on BI 7.0 (system ID DXX).
While loading master data for InfoObject XXXX00001 (the main characteristic in our system - like material) we are facing the following error:
Yellow warning "Second attempt to write record '2.347.263' to BIC/XXXX00001 was successful"
We are loading the master data from data source ZD_BW_XXXXXXX (from the APO system) through the DTP ZD_BW_XXXXX / XXX130 -> XXXX00001.
The master data tables (S, P, X) are not updated properly.
The following repair actions have been taken so far:
1. Delete all related transactional and master data, checking all relations (tcode SLG1 -> RSDMD, MD_DEL)
2. Follow instructions from OSS note 632931 (tcode RSRV)
3. Run report RSDMD_CHECKPRG_ALL from tcode SE38 (using both check and repair options).
After deleting all data, the previous tests were OK, but once we load new master data the same problem appears again, and the report RSDMD_CHECKPRG_ALL gives the following error:
"Characteristic XXXX00001: error found during this test."
The RSRV check for "Compare sizes of P and X and/or Q and Y tables for characteristic XXXX00001" is shown below:
Characteristic XXXX00001: Tables /BIC/PXXXX00001, /BIC/XXXXX00001 are not consistent - 351.196 deviations.
It seems that our problem is described in OSS note 1143433 (SP13), even though we are already on SP16.
Could somebody please help us, and let us know how to solve the problem?
Thanks for everything,
    Luis

  • Problem accessing basic data types

    Hi,
    I am a newbie at using JNI so please don't mind if I am asking something trivial.
I have a JNI wrapper for native C code. The C code is a Gtk+ application using the GLib library. This library has its own basic data types; for example, "gchar" corresponds to "char". I generated the JNI wrappers using the tool named "Swig", which is an interface between C and other programming languages such as Java. Since "gchar" is not understood by Swig as "char", it has treated "gchar" as some reference type and generated another class for it, and instead of accepting a simple char it expects a long.
Even if I pass a numerical value like 11111 after instantiating this newly generated gchar class, the JVM crashes when running the program, complaining of a SIGSEGV received from the underlying libraries.
I am confused, first because the error is not understood, and second about how I can tell JNI that gchar is similar to char.
What approach should I follow to solve the problem I am facing? Any feedback will be appreciated.
    Thanks & Regards

At run time you can see all the data... like what I have shown...
But if you look closely, DATE will be in the internal format; if you print it, it will be in dd:mm:yyyy.
Can you suggest how, if I have a dynamic field symbol (table data), I can convert data types dynamically?
If it is a static internal table I achieve it with the WRITE TO statement... but I have huge data in field symbols...
Instead of all this, please specify the exact problem you are facing. What is it with the date field? In SAP, the internal format is converted to external while printing. What is your requirement for this date field?
    My output looks some thing like this:
    04 36876 15.09.2011 39600 1999
    06 36960 15.09.2011 39600 2632
    07 36874 15.09.2011 39541 9232
    My expected output
    04 36.876 15.09.2011 39.600 1.999
    06 36.960 15.09.2011 39.600 2.632
    07 36.874 15.09.2011 39.541 9.232
I don't see any problems mentioned in your date field; both your actual and expected outputs show the same date values (the difference is only the thousands separators in the numeric columns).
On SCN you will only get solutions if your question is precise.
    Kesav

  • A problem calling a LabVIEW VI from VB

    Hi all:
I am facing a problem with data transfer and parallel operation between VB and LabVIEW.
I want to develop a VB program in which a LabVIEW VI can be called and the corresponding parameters transferred to LabVIEW, and then I can also operate my system from the VB program at the same time - something like parallel operation (VB and LabVIEW programs).
But the questions are:
1. If I use the "Call" method of ActiveX in VB, and the LabVIEW subVI has not stopped (for example, a loop structure), I cannot operate the VB program in parallel. The error message is "other application is busy", which is attached below. The sample code is also attached.
2. I tried to use other methods like "OpenFrontPanel" and "Run", but I am not sure how to transfer the parameters.
3. Then I tried to use "SetControlValue" to set the parameters, but there is an error ":= expected", which is very strange, because the statement I wrote follows the help documents [e.g. VI.SetControlValue ("string", value)]. Why does it still need a "="?
Does anybody know something about this? Thanks a lot.
    Attachments:
    vb_labview_error_message_1.JPG ‏14 KB
    VB_to_LV.zip ‏10 KB

    I sure hope OP has solved it by now.
    /Y

  • Problem with billing date at closed period

    Hi
We have a problem with the billing date in a transaction. Previously we could set a different (earlier) billing date than the actual posting period, and now it is not working: we receive an error message that the posting period is closed.
I attached an example invoice which we managed to post and where you can see what I'm thinking about; parking number: 1024337, billing document: 841835. In this case the billing date was 30.11.2007 but the posting was on 15.07.2008, and we managed to post.
Please help with this issue.
    yps

Hi,
It is due to the posting period being closed for the month 07/2008.
You need to open the posting period for that month.
Regards,
Greeshma

  • Question about transferring data from iPhone 3GS to iPhone 4

I just had a couple of quick questions about transferring data from my old phone to my new iPhone 4. I am asking because I am worried about whether I will encounter any problems when doing so.
First off, I already sold my phone today; I reset all data and settings on the phone and gave it to my buddy, so it's gone. I did a full sync and backup yesterday, so all the necessary files should be on my computer (Windows 7). Now, I'm basically wondering if I will run into any problems if I restore my iPhone 4 from a backup. My 3GS was running 3.1.2 on AT&T. I know that IDEALLY I would have updated it to iOS 4 before backing it up and used the newest version of iTunes, but I did not. Does anyone think this will be a problem for me?
With that out of the way, my biggest fear is losing my old data (text messages and notes mainly, because I am a pack rat for those types of things), so I'd like to be SURE that none of my old backups will be deleted in any scenario. The reason I don't just restore it right now is that I want my new phone to be as clutter-free as possible. I am going to put on here only the apps that I used often, and would basically like to transfer over the BARE minimum: texts, notes, and highly used apps. So I guess my main question is: can you transfer over only certain things, like texts and notes, after setting up the phone as a new phone? And if I were to set up the phone as a new phone, what would happen to my old backups? Would I be able to selectively restore?
I'm afraid that it might not be possible to transfer only certain things, even though it should be... I should be able to select a text messages folder, put it on my new phone, and be done with it. But anyway, I don't want to rant. Can anyone explain to me how this all will work?
ULTIMATE GOAL: Transfer only texts, notes, certain apps (and their data) and NOTHING ELSE.
MOST IMPORTANT THING: Not losing texts and notes. I can deal with putting all the old **** on my new phone and cluttering/slowing it down if I NEED to.
Thank you in advance; sorry for the long post.

    If the most important thing for you is keeping old text messages, notes, and voicemail, then you'll need to sync the phone from your existing backup. I know of no other way to access those items.
    Once you have synced to the new phone, check that you have those items that were important. Then you can reconnect your phone to iTunes, and change the sync settings to remove the apps or other items you no longer want to keep on the phone.
    iPhone backups are stored by iTunes; you can see them by opening your iTunes preferences, clicking on "Devices" and then looking in the window. You can delete old backups from here. I don't know how you can open/read the backups though.
    I don't expect you'd have any problems syncing from your old phone's backup, but it's definitely an either/or situation. Since you got rid of the old phone already, it's too late to email yourself your notes, or copy the text messages. Your previous backup is your only solution.

Maybe you are looking for

  • Had to format PC, how do I get my apps back?

    So as any windows user knows, every so often we have to format our PC, and I just had to do that the other day. After the machine was formatted, I started installing my programs again, and I just finished with iTunes. I have all my music and movies b

  • Ipod connection to 2014  Toyota Camry

    Having issues connecting Ipod touch (1st gen) to 2014 camry via usb.  supposed to be compatible, says "ipod authorization unsucessful" , any ideas?

  • Need DataSource names for R/3 tables

    Hi All, I need the data Source names for the below 6 R/3 tables, EKKO , EKPO, T024 , T052U, LFA1 , LFB1 This could be helpful for me to do a LO Extraction. Thanks in advance, Arun.M.D

  • Fatal msi error mp.msi could not be installed

    Good morning, I have SCCM 2012 R2 running on Server 2012 R2.  I've got a primary site and two secondary sites, all of them with identical hardware and software.  I've set them all up the exact same way, but on one of my secondary servers I can not ge

  • Currency signs

    This may be a stupid question, but does anyone know how to make a currency sign visible in forms and documents rather than the type. e.g. I would like to display £ not GBP Many thanks Dom