Is this too much data validation?

Hi, all:
I am doing a one-to-one data migration from SQL Server to an Oracle dev database. Nothing fancy is involved: no functions, no calculations, no WHERE clauses, no joins, etc. All I do is use an ETL tool to move the data as-is from SQL Server to Oracle. There are lots of tables, most of them with fewer than a hundred records. And the ETL tool is very good: it tells you how many rows are in the source table and how many rows got into the target table, and reports any errors. Usually, when there is an error, the data won't get loaded into the target table at all.
After I finished the migration, I just did some count(*) checks, randomly verified some records, and was done with it.
But my manager asks me to do the following for each table:
Summing up some of the numeric fields on each side to see if the totals match
Counting occurrences of some specific dates in the date columns
Performing count(distinct()) on some fields and comparing the values
Performing a min/max on some fields and comparing the values
Performing some of the above on tables that are normally joined
Other checks that may be specific to the table and columns that have been brought over (a sketch of what these queries look like follows below)
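Each of those checks boils down to running the same aggregate query on both sides and comparing the two result sets. A sketch, with made-up table and column names:

-- Run identically against the SQL Server source and the Oracle target,
-- then compare the outputs. (The DATE literal syntax is Oracle's; use
-- CONVERT(date, ...) on the SQL Server side.)
SELECT COUNT(*)                    AS row_cnt,
       SUM(order_amount)           AS amount_total,
       COUNT(DISTINCT customer_id) AS distinct_customers,
       MIN(order_date)             AS min_order_date,
       MAX(order_date)             AS max_order_date,
       COUNT(CASE WHEN order_date = DATE '2012-01-01'
                  THEN 1 END)      AS rows_on_that_date
FROM   orders;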
I really don't think that's necessary. It's a simple data move with a commercial ETL tool, with no manipulation in between. If there is no error, a quick check is enough. I don't see how the data could get messed up during the migration in a way that would require such detailed validation. By his logic of over-cautiousness, I should do all this thorough checking after every export and import too.....
Anyone agree with me?

Personally I think it's a stupid idea: spot checks like that will only show you the differences you happen to look for.
If there is some reason to suspect that the ETL tool might not be transactionally sound, and is insufficiently instrumented to flag up any potential issues, then I'd suggest it was a poor choice of tool in the first place!
Personally I would have used the native distributed heterogeneous database support provided by default, for free, with Oracle.
Now, since there is data type mapping involved in migrating from SQL Server to Oracle, and since there is clearly no management buy-in to the integrity of the tool, I'd say it is now necessary (if this database is of any importance) to check for any and all discrepancies.
To do this, just create a database link from Oracle to SQL Server and write some infrastructure that will show you all differences (e.g. MINUS views, sketched below).
This will be trivial to write, with the small exception that you must formally decide how to map from (say) an Oracle date to a SQL Server date, SQL Server text columns to Oracle (VARCHAR2/CLOBs), how string trimming is expected to behave, etc.
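A minimal sketch of one such comparison, assuming a gateway database link named sqlsrv and a CUSTOMERS table on both sides (both names invented):

-- Rows in Oracle that have no exact match in SQL Server:
SELECT customer_id, TRIM(customer_name) AS customer_name, created_date
FROM   customers
MINUS
SELECT customer_id, TRIM(customer_name), created_date
FROM   customers@sqlsrv;
-- Swap the two halves for the other direction; both queries must return
-- zero rows. The TRIM (and any TO_DATE/TO_CHAR wrapping you add) is
-- exactly where those formal type-mapping decisions live.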
It's very easy to do since it is iterative, and you will see each difference dissolve away.
This will be a worthwhile exercise.
At the end you can show your boss living proof that there are no unknown differences at all, and at the same time compile a list of all known differences (datatype mappings) so that these can be used in impact analysis of the application, so that changes can be made along with the obviously required change in the transaction model of the applications.

Similar Messages

  • Yosemite OS is consuming too much internet data in the background

    Yosemite OS is consuming too much internet data in the background for no apparent reason, and it's draining my monthly internet quota. It's important to note that iCloud Drive and Photo Stream are disabled. Does anyone know how to solve this problem? Otherwise, if there's no solution, I think I'll downgrade to Mavericks.

    I'm wondering if you ever resolved this issue. My internet access is via satellite, with a tightly limited data allowance. Ever since updating to Yosemite, my usage has gone through the roof. I've disabled automatic downloads and any settings (including iCloud Photo Stream) that might receive or send data, and I never use Spotlight. Data usage goes to zero when I shut off WiFi, so it seems obvious that something running in the background is the culprit. Apple is looking into this, with no results so far - meanwhile, I've lost hours of time playing detective, and I'm hoping my ISP won't cut me off. No fun!
    Thanks for any ideas you have...

  • Too much meta data when importing to Flickr

    I've noticed that when I export a photo directly to Flickr it seems to carry a TON of useless metadata with it. It happens when I use the Flickr export plug-in as well as when uploading to Flickr from their site. Here is an example of too much data (basically everything after Compression):
    http://www.flickr.com/photo_exif.gne?id=108281029&context=set-1648131
    If I retouch the photo in Photoshop first, then everything below Image Length is not included (which is my preference). Here is an example:
    http://www.flickr.com/photo_exif.gne?id=111724050&context=set-1648131
    Has anyone else experienced this? Has anyone been able to change this function? Thanks!

    I have not used my EOS-M for video but the GoPro 3+ Black files import with all Metadata intact.

  • Is this too many chained rows? How can I prevent chained rows?

    Hi,
    Due to a performance issue on my database, I came across articles on chained/migrated rows ... and ran a script to check for chained rows .....
    I have chained rows in 2 tables, but only one is worth mentioning. It is a table that has 50 CLOB columns and 1.1 million records .....
    After running the script for chained rows I get 500,000 chained rows out of the 1.1 million ....
    I will now do as explained in the forums and books and reinsert these rows ..... to try to fix this.
    So my question is: what do I need to do to prevent getting so many chained rows, if I can do anything at all? I understand that chaining can't be prevented for some rows ...
    The database block size is 8192 ..... The average row length (from stats) of this table is 6093, estimated size 8.9 GB .... PCTFREE is 10 by default ...
    At this moment I'm getting the warning "PCTFREE too low for a table", and it is at 1.3...
    Do I need to increase the database block size and/or increase PCTFREE to somewhere in the range of 20-25? If yes, can I somehow increase the block size only for this table, since recreating a database that is 79 GB would take some time ...?
    Performance is big issue, disk space is not ...
    Thank you.
    Kris

    user10702996 wrote:
    The whole inserted row contains data about one newspaper article ..... So what we did for better search performance is to "cache" every word from the article into a defined CLOB column, ordered by first character ... so words starting with A are in the CHAR_1 CLOB column, B is in CHAR_B, and so on ....
    How are you querying the data?
    From your description, it looks as if you need to look at Oracle's "Text" indexing - I am basing this comment on the assumption that you are trying to do things like "find all articles that reference aardvarks and zebras", and turning this into a search of the "A lob" and the "Z lob" of every row in the table. (I'm guessing that your biggest performance problem is actually the need to examine every row, rather than the problem of chained rows - obviously I may be wrong.)
    If you use Context (or interMedia, the name changes with version) you need only store the news item once as a LOB and then create a Text index on it - leaving Oracle to build supporting structures that allow you to run such queries fairly efficiently.
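    A rough sketch of that approach, assuming the article is stored once in a single hypothetical ARTICLE_TEXT CLOB:

    -- Store each article once, then let Oracle Text index the words.
    CREATE INDEX article_text_idx ON articles (article_text)
      INDEXTYPE IS CTXSYS.CONTEXT;

    -- "Find all articles that reference aardvarks and zebras":
    SELECT article_id
    FROM   articles
    WHERE  CONTAINS(article_text, 'aardvark AND zebra') > 0;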
    Regards
    Jonathan Lewis
    http://jonathanlewis.wordpress.com
    http://www.jlcomp.demon.co.uk
    To post code, statspack/AWR report, execution plans or trace files, start and end the section with the tag {noformat}{noformat} (lowercase, curly brackets, no spaces) so that the text appears in fixed format.
    "Science is more than a body of knowledge; it is a way of thinking"
    Carl Sagan

  • Serial-Buffer overflow without too much incoming data or bursts on the line

    My application runs under LabVIEW 5.11 on a Windows 2000 platform.
    After a couple of hours of trouble-free operation, the "Bytes at Serial Port" sub-VI reports 57351 bytes. That is 56k plus the 7 bytes of the normal data block coming in at the port.
    The effect is that the "Serial Port Read" VI, and as a result the complete application, hangs.
    I have already found out that this data set does not arrive at the port.
    Could this be a problem with LabVIEW 5.11 running under Windows 2000?

    I would first urge you to disconnect the cable from the serial port and let it run more than long enough to convince yourself the error goes away or re-occurs.
    If you disconnect, and the error still occurs, your port hardware may be bad. Try another port or PC.
    If the problem goes away, you could have noise on the line (interference, bad ground, bad connections, an excessively long cable, etc.), or
    your code is not coming back to read the port often enough. If you think this is the case, read from the port more often or use hardware handshaking.
    I would like to know a lot more before saying it is a software incompatibility issue.
    Keep us updated,
    Ben
    (An LOTR fan)
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • iTunes Library Request: Apple, Is This Too Much To Ask??

    Yes, I know there are other threads on the subject of Library placement, etc. But they tend to digress, and I believe my question is very simple. Here goes:
    All of my music is presently on an external USB/FW drive (a 75gb partition of a 320gb drive, properly partitioned and formatted as "MSDOS" with XP and OS X, so that OS X can read/write).
    I want to establish my INITIAL iTunes music library on this external drive. iTunes v7.3(54) is installed -- BUT with no library yet. I also don't yet have either of the iTunes "library files", .itl or .xml.
    All I want is to make this 15-20gb of music on the external drive become my iTunes library. And I am willing to have to connect the external drive in order to play the music.
    Do I have to perform any iTunes "gymnastics" to do this? I'm perfectly willing to create a new folder on the external drive if iTunes HAS to move the files (???) and let the files "moooove". But this can't be hard, can it??
    Any help?
    Many thanks in advance,
    Ken S

    What you need to do is delete the "iTunes" folder in
    your Music directory in Home, then create the
    "iTunes" folder on your external drive.
    Now, make an alias of the "iTunes" folder on the
    external drive, and copy it to the "Music" folder
    under your Home directory. Voila.
    WebCrasher-
    First, THANK YOU! Frustrated when my question sat idle for two days, I let my subsequent sarcastic post lack civility. But you still had the class to offer a serious technical response to my question (I fully expected to be flamed for my last post). Appreciated.
    I'd just about decided to create the iTunes Music folder on the external drive, but hadn't considered placing an alias to the external data in a local folder (which I assume I would create after iTunes added all the music to its library, having first deleted the same folder?).
    Two questions on your proposed solution:
    1. Would I then have to re-SET my library location to my home folder (under Preferences|Advanced->SET)? And if so, would I also then have to "consolidate" (i.e., pretend to combine the library) using "Advanced|Consolidate Library" (from the GUI menu)? Can iTunes really be happy adding music, etc. to a "pseudo-Library" (the aliased folder) in my home directory?
    and
    2. Is there a significant advantage to this technique over simply letting iTunes manage its library on the external drive, as long as its two DB files - .itl and .xml - remain local? In either case I'd have to have the USB/FW external drive connected to the PowerBook to use iTunes, correct?
    TIA,
    Ken S

  • Too much 'other' data how do i get rid of it?

    I have just tried to sync my iPhone 5 32gb and it has suddenly got 24gb of 'other', leaving me 8gb over the limit. Should I restore it?
    Thanks

    'Other' that large is corrupt data. To fix it, you'll have to restore the phone. You can first try restoring from backup, followed by syncing your content back to the phone. This sometimes fixes the issue, but if it doesn't, you'll have to follow the instructions here to restore as a new device:
    http://support.apple.com/kb/ht1414

  • Droid Turbo using way too much mobile data

    My Droid Turbo has used almost all of its 2gb of mobile data in the first month. I am on WiFi 95% of the time, and all I use mobile data for is Google Maps.
    Has anyone else seen that the Droid Turbo is counting WiFi data as mobile data?

    You can also disable some of the more obscure apps that you do not use. Big data users are My Verizon and Verizon Cloud (use Google instead). There are quite a few, and a list can be obtained of which are safe to disable. Then make sure your phone is set to only update apps over WiFi as well; a lot of data usage is due to app updates.

  • HT4858 Photos take up too much iCloud data

    I've deleted most photos from my Camera Roll and my Photo Stream; I only have 114 photos altogether now, but the photos are still taking up 4.5GB of my iCloud storage. Can anyone explain why?

    The only photos stored in iCloud are...
    1) Photos in the camera roll that are included in a backup to iCloud. But you can turn off backups of such photos:
    Settings>iCloud>Storage & Backups>Manage Storage, tap the device's name, and on the next screen make sure Camera Roll is turned OFF if you don't want to store photos in a backup.
    2) Photos stored in Photo Stream. But these do not count against your storage, so they don't matter.
    That's it. There are no other photos in iCloud that you can remove.

  • Is this video too much for my G4 Powerbook?

    Hi,
    I have downloaded a video from Vimeo and it won't play. At best it gets very choppy, even the audio, to the point where it just stalls.
    I have a G4 powerbook, 15"
    1.6GHz
    2GB RAM
    250GB hard drive (165GB free)
    OSX 10.5.6
    Quicktime Pro 7.6
    The video is;
    H.264 Decoder
    1280x720
    Millions
    AAC
    Data size 52.99MB
    Data rate 6233.57kbits/s
    Is this too much for my good old Mac?

    Thanks Kirk, what a shame. I'd love a new MacBook Pro, but I always said I would use my G4 till it fell to pieces, never thinking it would fail me in such an unremarkable way.
    Might have to wait till the end of the summer though.....
    Would you mind educating me somewhat? I wasn't sure from the details given in the QuickTime movie inspector whether the video was HD or not. The creator of the video shot it on a Canon Rebel XT SLR camera, in burst mode. That itself made me doubt it was HD, but Vimeo deals in HD footage, so I just didn't know.
    How did you deduce it to be HD, and what should I be looking for? How much data could my G4 handle?
    Thanks again Kirk!

  • IChat "AIM server has temporarily limited your account due to too much activity."

    I can't log on to iChat even after changing my port to 443. Other people here at work have reported the same problem (three others that I know of). I've seen old posts stating the same problem, with some resolved by just waiting a couple of days. Any known fixes by now?
    I will add this: yesterday I was sent a spam chat from a coworker and clicked on it. The link launched a web page, but I didn't interact with it. It turns out this coworker's AIM account was hacked. His account is fine today, but two other coworkers who were also sent the spam are in the same boat as me. However, a third person with this "too much activity" problem was not sent the spam, though I did interact with him on iChat yesterday. Could I now be infected with an iChat virus, or was my account hacked? I don't really know how to check.

    Hi,
    AIM does suspend accounts for various reasons.
    One reason is being abusive in chatrooms and having other people complain.
    You probably need to start here
    8:42 PM      Wednesday; November 9, 2011
    Please, if posting Logs, do not post any Log info after the line "Binary Images for iChat"
     G4/1GhzDual MDD (Leopard 10.5.8)
     MacBookPro 2Gb( 10.6.8)
     Mac OS X (10.6.8),
    "Limit the Logs to the Bits above Binary Images."  No, Seriously

  • For the last month or so my iCloud won't back up; it keeps telling me I am using too much storage and need to delete some data. But I have deleted between 300 and 400 photos and it is still saying the same. Can anyone please help?


    If they were deleted from your camera roll that should have resulted in a corresponding decrease in the estimated size of your next backup.
    If it's still saying that you don't have enough storage to back up, read through this article: http://support.apple.com/kb/ht4847.  It provides some suggestions for reducing your iCloud storage, such as deleting unneeded email. 
    The other area to look at is your text messages.  These are included in the backup.  If you have lots of photos and videos attached to your messages, that can significantly increase the size of your backup.  If that's the case, deleting these may make a big difference.

  • I don't want to write too much code - is there a different way of doing this?

    I am writing a procedure to check the max test scores for different codes ('S01','S02','S03') - there are more.
    I need to insert into the table if a record with the best score does not exist, for example for b.sortest_tesc_code = 'BSV'. Right now I am writing one cursor
    per code (a.sortest_tesc_code = 'S01'); is there a way to do this differently, so I can do something like a.sortest_tesc_code IN ('S01','S02','S03'), store the result in a
    variable, and then insert? The problem is that one student may have only one test and another may have two, etc. - it is not consistent. Also, if the record (b.sortest_tesc_code = 'BSV') is already in the table, I don't do an insert; I have to do an update if the new sortest_test_score is greater, since a student can submit scores more than once... In other words: check whether the record (b.sortest_tesc_code = 'BSV') exists; if it is there, compare it with the new max score and update if the new max score is greater; if the score (by code) is not in the table, insert it.
    I hope this is clear. This is what I have; I know it will work, but it is too much code... checking for EXISTS and NOT EXISTS in two different procedures...
    Thank you
    CURSOR get_the_max_scores_S01_cur IS
                SELECT
                 sortest_pidm, a.sortest_test_score, a.sortest_tesc_code,
                 a.sortest_test_date,a.sortest_equiv_ind
                FROM
                saturn.spriden, saturn.sortest a, saturn.stvtesc
               WHERE 
               a.sortest_pidm = spriden_pidm
              AND stvtesc_code = a.sortest_tesc_code
              AND spriden_change_ind IS NULL
           -----and   a.sortest_tesc_code in ('S01','S02'.S03')
           AND a.sortest_tesc_code = 'S01'
           --and spriden_id = p_student_id  --
           ---for test purposes
           AND sortest_pidm = 133999 ----THE WILL BE A PARAMETER
           AND a.sortest_test_score =
                  (SELECT MAX (b.sortest_test_score)
                     FROM saturn.sortest b
                    WHERE a.sortest_tesc_code = b.sortest_tesc_code
                          AND a.sortest_pidm = b.sortest_pidm)
                                AND NOT EXISTS
                  (SELECT 1   FROM    saturn.sortest b
                  WHERE    A.sortest_tesc_code = b.sortest_tesc_code
                          AND a.sortest_pidm = b.sortest_pidm     
                          and   b.sortest_tesc_code = 'BSV');
         BEGIN
                     UTL_FILE.fclose_all;
                     v_file_handle := UTL_FILE.fopen (v_out_path, v_out_file, 'a');
                    UTL_FILE.put_line (v_file_handle,
                          CHR (10) || TO_CHAR (SYSDATE, 'DD-MON-YYYY HH:MI:SS'));
                   UTL_FILE.put_line (v_file_handle, 'sortest_best_sct_scorest');
                   --check for an open cursor before opening
                   IF get_the_max_scores_S01_cur%ISOPEN
                       THEN
                        CLOSE get_the_max_scores_S01_cur;
                   END IF;
                OPEN get_the_max_scores_S01_cur;
                 LOOP
                       FETCH get_the_max_scores_S01_cur
                       INTO v_pidm, v_tscore, v_testcode,
                               v_test_date, v_equiv_ind;
                       EXIT WHEN get_the_max_scores_S01_cur%NOTFOUND;
                   IF  get_the_max_scores_S01_cur%FOUND 
                    THEN
                        INSERT INTO saturn.sortest
                        (sortest_pidm, sortest_tesc_code, sortest_test_date, sortest_test_score,
                         sortest_activity_date, sortest_equiv_ind, sortest_user_id, sortest_data_origin)
                        VALUES
                        (v_pidm,
                         'BSV',
                         v_test_date,
                         v_tscore,
                         SYSDATE,
                         v_equiv_ind,
                         p_user,
                         'best_test_scores process');
                    END IF;
                   END LOOP;
                   COMMIT;
                   ---Initialize variables
                    v_pidm := NULL;
                    v_test_date := NULL; 
                    v_tscore  := NULL; 
                    v_equiv_ind :=  NULL;
                    v_testcode  :=  NULL;
                 CLOSE get_the_max_scores_S01_cur;
    ----then another cursor does the same for S02...S03
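    One way to collapse the per-code cursors into a single statement might be a MERGE per (source code -> best code) pair, driven by an aggregate. This is only a sketch against the SORTEST names above, showing the S01 -> BSV pair; note SORTEST_TEST_SCORE is VARCHAR2, so MAX() and "<" compare as text, just as the MAX() in the cursor already does, and the 'N' equiv_ind is a placeholder:

    MERGE INTO saturn.sortest t
    USING (SELECT sortest_pidm,
                  MAX(sortest_test_score) AS best_score,
                  MAX(sortest_test_date)
                    KEEP (DENSE_RANK LAST ORDER BY sortest_test_score) AS best_date
           FROM   saturn.sortest
           WHERE  sortest_tesc_code = 'S01'
           GROUP  BY sortest_pidm) s
    ON (t.sortest_pidm = s.sortest_pidm AND t.sortest_tesc_code = 'BSV')
    WHEN MATCHED THEN
      UPDATE SET t.sortest_test_score    = s.best_score,
                 t.sortest_test_date     = s.best_date,
                 t.sortest_activity_date = SYSDATE
      WHERE t.sortest_test_score < s.best_score
    WHEN NOT MATCHED THEN
      INSERT (sortest_pidm, sortest_tesc_code, sortest_test_date,
              sortest_test_score, sortest_activity_date, sortest_equiv_ind,
              sortest_user_id, sortest_data_origin)
      VALUES (s.sortest_pidm, 'BSV', s.best_date, s.best_score,
              SYSDATE, 'N', p_user, 'best_test_scores process');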

    Thank you; here is the code. I changed the names of the tables, but it is the same concept. What I need is to extract the max score for each code (S01, S02, S03, S04)
    and then insert a record with a different code into the same table:
    BSM     Best Math SAT (S01)
    BSW     Best Writing SAT (S04)
    BSC     Best Reading SAT (S03)
    BSE     Best Reading SAT (S02)
    I need to be able to check whether the "best" codes (BSM...BSC...) are already in the table. If they are not, do an insert; if they are, get the maximum score
    again (students can submit more than one score for the same code, on any date) and, if the maximum is greater than what is already in the database (under the BSM...BSC.. codes), do an update. If it is the same or less, don't update...
    I need the PERSON table because I need to use the ID as a parameter; the user can run the process for one ID or for all the records in the TEST table.
    Thank you. I hope this is clear.
    CREATE TABLE TEST (
    TEST_PIDM             NUMBER(8)            NOT NULL,
    TEST_TESC_CODE        VARCHAR2(4 CHAR)     NOT NULL,
    TEST_TEST_DATE        DATE                 NOT NULL,
    TEST_TEST_SCORE       VARCHAR2(5 CHAR)     NOT NULL,
    TEST_ACTIVITY_DATE    DATE                 NOT NULL,
    TEST_EQUIV_IND        VARCHAR2(1 CHAR)     NOT NULL
    );
    INSERT INTO TEST (TEST_PIDM, TEST_TESC_CODE, TEST_TEST_DATE, TEST_TEST_SCORE, TEST_ACTIVITY_DATE, TEST_EQUIV_IND)
    SELECT 128019, 'EB',  TO_DATE('01-JUN-2004','DD-MON-YYYY'), '710', SYSDATE, 'N' FROM DUAL;
    INSERT INTO TEST (TEST_PIDM, TEST_TESC_CODE, TEST_TEST_DATE, TEST_TEST_SCORE, TEST_ACTIVITY_DATE, TEST_EQUIV_IND)
    SELECT 128019, 'M2',  TO_DATE('01-JUN-2005','DD-MON-YYYY'), '710', SYSDATE, 'N' FROM DUAL;
    INSERT INTO TEST (TEST_PIDM, TEST_TESC_CODE, TEST_TEST_DATE, TEST_TEST_SCORE, TEST_ACTIVITY_DATE, TEST_EQUIV_IND)
    SELECT 128019, 'S01', TO_DATE('01-JUN-2005','DD-MON-YYYY'), '750', SYSDATE, 'N' FROM DUAL;
    INSERT INTO TEST (TEST_PIDM, TEST_TESC_CODE, TEST_TEST_DATE, TEST_TEST_SCORE, TEST_ACTIVITY_DATE, TEST_EQUIV_IND)
    SELECT 128019, 'S01', TO_DATE('01-JUN-2005','DD-MON-YYYY'), '720', SYSDATE, 'N' FROM DUAL;
    INSERT INTO TEST (TEST_PIDM, TEST_TESC_CODE, TEST_TEST_DATE, TEST_TEST_SCORE, TEST_ACTIVITY_DATE, TEST_EQUIV_IND)
    SELECT 128019, 'S02', TO_DATE('01-JUN-2005','DD-MON-YYYY'), '740', SYSDATE, 'N' FROM DUAL;
    INSERT INTO TEST (TEST_PIDM, TEST_TESC_CODE, TEST_TEST_DATE, TEST_TEST_SCORE, TEST_ACTIVITY_DATE, TEST_EQUIV_IND)
    SELECT 128019, 'S02', TO_DATE('05-JUL-2005','DD-MON-YYYY'), '730', SYSDATE, 'N' FROM DUAL;
    INSERT INTO TEST (TEST_PIDM, TEST_TESC_CODE, TEST_TEST_DATE, TEST_TEST_SCORE, TEST_ACTIVITY_DATE, TEST_EQUIV_IND)
    SELECT 128019, 'S03', TO_DATE('01-JUN-2005','DD-MON-YYYY'), '780', SYSDATE, 'N' FROM DUAL;
    INSERT INTO TEST (TEST_PIDM, TEST_TESC_CODE, TEST_TEST_DATE, TEST_TEST_SCORE, TEST_ACTIVITY_DATE, TEST_EQUIV_IND)
    SELECT 128019, 'S03', TO_DATE('05-JUL-2005','DD-MON-YYYY'), '740', SYSDATE, 'N' FROM DUAL;
    INSERT INTO TEST (TEST_PIDM, TEST_TESC_CODE, TEST_TEST_DATE, TEST_TEST_SCORE, TEST_ACTIVITY_DATE, TEST_EQUIV_IND)
    SELECT 128019, 'S04', TO_DATE('01-JUN-2005','DD-MON-YYYY'), '770', SYSDATE, 'N' FROM DUAL;
    INSERT INTO TEST (TEST_PIDM, TEST_TESC_CODE, TEST_TEST_DATE, TEST_TEST_SCORE, TEST_ACTIVITY_DATE, TEST_EQUIV_IND)
    SELECT 128019, 'S04', TO_DATE('05-JUL-2005','DD-MON-YYYY'), '740', SYSDATE, 'N' FROM DUAL;
    CREATE TABLE PERSON (
      PERSON_PIDM                NUMBER(8)         NOT NULL,
      PERSON_ID                  VARCHAR2(9 CHAR)  NOT NULL
    );
    INSERT INTO PERSON (PERSON_PIDM, PERSON_ID)
    SELECT 128019, '003334556' FROM DUAL;
    CREATE TABLE VALTSC (
      VALTSC_CODE             VARCHAR2(4 CHAR)     NOT NULL,
      VALTSC_DESC             VARCHAR2(30 CHAR)
    );
    INSERT INTO VALTSC (VALTSC_CODE, VALTSC_DESC)
    SELECT 'S01', 'XXS01' FROM DUAL;
    INSERT INTO VALTSC (VALTSC_CODE, VALTSC_DESC)
    SELECT 'S02', 'XXS02' FROM DUAL;
    INSERT INTO VALTSC (VALTSC_CODE, VALTSC_DESC)
    SELECT 'S03', 'XXS03' FROM DUAL;
    INSERT INTO VALTSC (VALTSC_CODE, VALTSC_DESC)
    SELECT 'S04', 'XXS04' FROM DUAL;

  • Data extraction to BW from R3 taking too much time

    Hi,
    We have one delta data load to an ODS from R3 that is taking 4-5 hours; the job runs in R3 itself for 4-5 hours, even for 30-40 records. After this, the ODS data is updated to the cube, but since the ODS load itself takes too much time, the delta brings 0 records into the cube, and we have to update it manually.
    Also, while the job for the ODS load is running, we can't check the delta records in RSA3; it gives the error "error occurs during extraction".
    Can you please guide me on how to make this load faster, and if any index needs to be built, how to proceed on that front?
    Thanks
    Nilesh

    Rahul,
    I tried with R; it gives me a dump with the message "Result of customer enhancement: 19571 records".
    Error details are -
    Short text
        Function module " " not found.
    What happened?
        The function module " " is called,
        but cannot be found in the library.
        Error in the ABAP Application Program
        The current ABAP program "SAPLRSA3" had to be terminated because it
        came across a statement that unfortunately cannot be executed.
    What can you do?
        Note down which actions and inputs caused the error.
        To process the problem further, contact your SAP system
        administrator.
        Using Transaction ST22 for ABAP Dump Analysis, you can look
        at and manage termination messages, and you can also
        keep them for a long time.

  • Query taking too much time with dates??

    hello folks,
    I am trying to pull some data using a date condition, and for some reason it's taking too much time to return the data:
       and trunc(al.activity_date) = TRUNC (SYSDATE, 'DD') - 1     -- If I use this, it takes too much time
       and al.activity_date >= to_date('20101123 000000', 'YYYYMMDD HH24MISS')
       and al.activity_date <= to_date('20101123 235959', 'YYYYMMDD HH24MISS') -- If I use this, it returns the data in a second. Why is that?
    How do I get the previous day without using the hardcoded to_date('20101123 000000', 'YYYYMMDD HH24MISS'), if I need to retrieve it faster?

    Presumably you've got an index on activity_date.
    If you apply a function like TRUNC to activity_date, you can no longer use the index.
    Post execution plans to verify.
    and al.activity_date >= TRUNC (SYSDATE, 'DD') - 1
    and al.activity_date < TRUNC (SYSDATE, 'DD')
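    Put together, the index-friendly version might look like this (table name is hypothetical):

    -- A half-open range on the bare column keeps any index on activity_date
    -- usable; TRUNC(SYSDATE, 'DD') - 1 is midnight yesterday, so nothing is
    -- hardcoded.
    SELECT *
    FROM   activity_log al
    WHERE  al.activity_date >= TRUNC(SYSDATE, 'DD') - 1
    AND    al.activity_date <  TRUNC(SYSDATE, 'DD');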
