DTW BP Catalog numbers many records

Hi Experts,
I have a business scenario where the client has around 500 customers and around 50 vendors. Each vendor has its own item list, with specific BP catalog numbers and prices for the items it supplies. To maintain the mapping between our internal item codes and the codes the vendors and customers refer to, we are using the BP Catalog Numbers functionality of SAP Business One. I am aware of the DTW import for BP catalogs.
The issue is this: assume there are around 1,000 item codes supplied by the different vendors, some of which may be common. If we build a BP Catalog import file for every customer covering all items (to spare ourselves the effort of analysing which item codes have catalog numbers different from the item code itself), the total comes to at least 500,000 records (500 × 1,000). A DTW import of this size takes hours; mine has been running for 3 hours without even showing a progress bar, and that is when importing from a database table rather than a CSV.
Could someone recommend a better out-of-the-box solution for this scenario?
Thanks and regards
Amit Arora

Hi Amit,
When executing any query, FMS, or data import, SAP validates every record and every combination before committing it; if it finds any data missing, it raises an error. With this many records to validate, the import will inevitably take a long time.
If you want it to complete faster, you may have to develop your own utility, either a program or some back-end routine.
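For example, since DTW can also read from a database table or view (as you are already doing), one option is to generate the whole record set with a single SQL statement and point DTW at the result. A minimal sketch, not a tested solution: it assumes a SQL Server company database and a hypothetical staging table CUST_CAT(CardCode, ItemCode, CatNum) that holds only the customer-specific catalog numbers:

-- Build the full customer x item list for the BP Catalog import.
-- OCRD = business partners, OITM = items; CUST_CAT is hypothetical.
SELECT C.CardCode,
       I.ItemCode,
       COALESCE(S.CatNum, I.ItemCode) AS Substitute  -- fall back to the item code
FROM   OCRD C
       CROSS JOIN OITM I
       LEFT JOIN CUST_CAT S
              ON S.CardCode = C.CardCode
             AND S.ItemCode = I.ItemCode
WHERE  C.CardType = 'C'   -- customers only

Also worth considering: since Business One simply shows the internal item code when no BP catalog number is defined for a partner, you may only need to import the rows where a customer-specific number actually exists (add AND S.CatNum IS NOT NULL and drop the COALESCE). That would shrink the job from 500,000 records to just the exceptions, which may bring the DTW run time back into an acceptable range.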
Regards,
Rahul

Similar Messages

  • How many records can be uploaded using BDC at a time

    Dear friends,
    I want to know how many records can be uploaded using BDC at a time from a legacy system (database). Is there any restriction on uploading a large volume of data?

  • Deleting BP Catalog Numbers

    Hi,
    Is there a way to delete all existing BP catalog numbers at once? I mistakenly ran a global update of one BP's catalog across the entire BP list and now want to start over from scratch. Thanks.

    Hi Bishal - I know that I can update and overwrite using DTW, but in this case I want to do a bulk removal, which is not possible with DTW. It is possible with SQL, but of course that is a no-no from SAP.
    Hi Jitin - I know I am able to remove the BP catalogs one by one from within the B1 interface, but because I accidentally created 800+ catalog entries, I'm looking for a faster way to clear all the records.
    Any other ideas? Thanks a lot.

  • Fetching many records all at once is no faster than fetching one at a time

    Hello,
    I am having a problem getting NI-Scope to perform adequately for my application.  I am sorry for the long post, but I have been going around and around with an NI engineer through email and I need some other input.
    I have the following software and equipment:
    LabView 8.5
    NI-Scope 3.4
    PXI-1033 chassis
    PXI-5105 digitizer card
    DELL Latitude D830 notebook computer with 4 GB RAM.
    I tested the transfer speed of my connection to the PXI-1033 chassis using the niScope Stream to Memory Maximum Transfer Rate.vi found here:
    http://zone.ni.com/devzone/cda/epd/p/id/5273.  The result was 101 MB/s.
    I am trying to set up a system whereby I can press the start button and acquire short waveforms which are individually triggered. I wish to acquire these individually triggered waveforms indefinitely. Furthermore, I wish to maximize the rate at which the triggers occur. In the limiting case where I acquire records of one sample, the record size in memory is 512 bytes (using the formula to calculate 'Allocated Onboard Memory per Record' found in the NI PXI/PCI-5105 Specifications under the heading 'Waveform Specifications', pg. 16). The PXI-5105 trigger re-arms in about 2 microseconds (500 kHz), so to trigger at that rate indefinitely I would need a transfer speed of at least 256 MB/s (512 bytes per record × 500,000 records/s = 256 MB/s). So clearly, in this case the limiting factor for increasing the trigger rate while still acquiring indefinitely is the rate at which I transfer records from memory to my PC.
    To maximize my record transfer rate, I should transfer many records at once using the Multi Fetch VI, as opposed to the theoretically slower method of transferring one at a time. To compare the transfer rates of the all-at-once and one-at-a-time methods, I modified the niScope EX Timestamps.vi to let me choose between them by changing the constant wired to the Fetch Number of Records property node to either -1 or 1, respectively. I also added a loop that ensures all records are acquired before I begin the transfer, so that acquisition and trigger rates do not interfere with measuring the record transfer rate. This modified VI is attached to this post.
    I have the following results for acquiring 10k records.  My measurements are done using the Profile Performance and Memory Tool.
    I am using a 250kHz analog pulse source.
    Fetching 10000 records one record at a time, the niScope Multi Fetch Cluster takes a total of 1546.9 milliseconds, or 155 microseconds per record.
    Fetching all 10000 records at once, the niScope Multi Fetch Cluster takes a total of 1703.1 milliseconds, or 170 microseconds per record.
    I have tried this for larger and smaller total numbers of records, and the transfer time is always around 170 microseconds per record, regardless of whether I transfer one at a time or all at once. But with a 100 MB/s link and a 512-byte record size, the fetch speed should approach 5 microseconds per record as the number of records fetched at once increases.
    With this, my application will be limited to a trigger rate of 5 kHz when running indefinitely, when it should be capable of closer to a 200 kHz trigger rate for extended periods of time. I have a feeling that I am missing something simple or am just confused about how the Fetch functions should work. Please enlighten me.
    Attachments:
    Timestamps.vi (73 KB)

    Hi ESD
    Your numbers for testing the PXI bandwidth look good. A value of approximately 100 MB/s is reasonable when pulling data across the PXI bus continuously in larger chunks. This may decrease a little when working with MXI in comparison to using an embedded PXI controller. I expect you were using the streaming example "niScope Stream to Memory Maximum Transfer Rate.vi" found here: http://zone.ni.com/devzone/cda/epd/p/id/5273.
    Acquiring multiple triggered records is a little different. There are a few techniques that will help make sure you can fetch your data fast enough to keep up with the acquired data, or with the desired reference trigger rate. You are certainly correct that it is more efficient to transfer larger amounts of data at once, instead of small amounts more frequently, as the overhead due to DMA transfers becomes significant.
    The trend you saw, that fetching fewer records was more efficient, sounded odd, so I ran your example and tracked down what was causing it. I believe it is actually the for loop that you had in your acquisition loop. I made a few modifications to the application to display the total fetch time to acquire 10000 records. The best fetch time is when all records are pulled in at once. I left your code in the application but temporarily disabled the for loop to show the fetch performance. I also added a loop to ramp the fetch number up and graph the fetch times. I will attach the modified application as well as the fetch results I saw on my system for reference. When the for loop is enabled, the performance was worst at 1-record fetches; the fetch time dipped around 500 records per fetch and began to ramp up again as the records per fetch increased to 10000.
    Note I am using the 2D I16 fetch as it is more efficient to keep the data unscaled.  I have also added an option to use immediate triggering - this is just because I was not near my hardware to physically connect a signal so I used the trigger holdoff property to simulate a given trigger rate.
    Hope this helps.  I was working in LabVIEW 8.5, if you are working with an earlier version let me know.
    Message Edited by Jennifer O on 04-12-2008 09:30 PM
    Attachments:
    RecordFetchingTest.vi (143 KB)
    FetchTrend.JPG (37 KB)

  • Number of records in SAP BI corresponds to how many records in the database

    Dear all,
    If I have 2 lakh (200,000) records in a DataSource, how does that correspond to the number of records in the database?

    For the first load of the DataSource, the number of records in the database (source) must equal the number of records in the PSA (DataSource).
    If you delete the PSA records daily and run a delta load, the number of records in the PSA must equal the number of records in the delta queue (RSA7) for the corresponding DataSource.
    If the load is a full load every time, or a full repair, the number of records in the PSA must equal the number of records in the source.

  • Update BP Catalog numbers

    I have a customer wanting to update BP Catalog numbers using DTW. I have found SAP Note 1275393 which states that this is not possible.
    Attached to the note is an SDK project which can be used to achieve this, however this is in VB6 & our current version of VB will not let us open this.
    Has anyone converted this to a later version of VB?

    Hi Julie,
    There are tools available to convert the old VB6 code to a newer version. Have you tried to find one?
    Thanks,
    Gordon

  • Import catalog numbers for multiple BP at the same time

    I have to import different groups of the same catalog numbers for multiple business partners. I have been looking through the DTW templates and have not found the right one yet. Does this functionality exist, and if so, where would I find it? If not, what should I do: enter each one by one?
    thx,
    Richard.

    Hi,
    I just looked at the available templates... I would use the oAlternateCatNum one.
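    From memory, that template maps to the BP Catalog Numbers table (OSCN), with one row per business partner/item pair, roughly along these lines (column names are from memory and should be checked against the template shipped with your DTW version):

    CardCode    ItemCode    Substitute
    C20000      A00001      CAT-0001
    C20000      A00002      CAT-0002
    C30000      A00001      CAT-0055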
    Regards,
    Eric

  • Function error:  Too many records

    I am writing a function that needs to return the total count of a SQL statement. It divides two calculated columns to get an average. I have two versions. Version 1 compiled successfully, and I am trying to run it either in Reports or in the database and call it. I get an error stating that the function returns too many records. I understand that a stored function must return a single value, but how can I modify the code so that it returns one value each time it is called?
    Here is the main calculation. SUM(date1-date2) / (date1-date2) = Avg of Days
    version 1:
    create or replace FUNCTION CALC_OVER_AGE
    RETURN NUMBER IS
      days_between NUMBER;
      days_over NUMBER;
    begin
      select (determination_dt - Filed_dt), SUM(determination_dt - Filed_dt)
        into days_between, days_over
        from w_all_cases_mv
        where (determination_dt - Filed_dt) > 60
        and ;
      return (days_between/days_over);
    END CALC_OVER_AGE;
    version 2:
    CREATE OR REPLACE FUNCTION CALC_OVER_AGE (pCaseType VARCHAR2)
    RETURN PLS_INTEGER IS
      v_days_between W_ALL_CASES_MV.DAYS_BETWEEN%TYPE;
      v_total NUMBER;
      days_over NUMBER;
      i PLS_INTEGER;
    BEGIN
      SELECT COUNT(*)
        INTO i
        FROM tab
        WHERE case_type_cd = pCaseType
        AND determination_dt - Filed_dt > 60;
      IF i <> 0 THEN
        select SUM(determination_dt-Filed_dt), days_between
          into v_total, v_days_between
          from tab
          where determination_dt - Filed_dt > 60;
        RETURN v_total/v_days_between;
      ELSE
        RETURN 0;
      END IF;
    EXCEPTION
      WHEN OTHERS THEN
        RETURN 0;
    END CALC_OVER_AGE;

    Table Structure:
    WB_CASE_NR NUMBER(10)
    RESPONDENT_TYPE_CD VARCHAR2(10)
    INV_LOCAL_CASE_NR VARCHAR2(14)
    CASE_TYPE_CD VARCHAR2(10)
    FILED_DT DATE
    FINAL_DTRMNTN_DT DATE
    REPORTING_NR VARCHAR2(7)
    INVESTIGATOR_NAME VARCHAR2(22)
    OSHA_CD VARCHAR2(5)
    FEDERAL_STATE VARCHAR2(1)
    RESPONDENT_NM VARCHAR2(100)
    DAYS_BETWEEN NUMBER
    LAST_NM VARCHAR2(20)
    FIRST_NM VARCHAR2(20)
    DETERMINATION_DT DATE
    DETERMINATION_TYPE_CD VARCHAR2(2)
    FINAL_IND_CD VARCHAR2(1)
    DESCRIPTION VARCHAR2(400)
    DETERMINATION_ID NUMBER(10)
    ALLEGATION_CD VARCHAR2(1)
    ALGDESCRIPTION VARCHAR2(50)
    The output is for Reports; I am trying to get the last calculation, which is the Average Days. The report is grouped on Case Types and has several bucket counts for each, like this:
    Case Type   Count All   Completed   Pending   Over Age   Avg Days
    A           5           3           4         2          15
    Z           10          7           6         3          30
    4           8           3           5         4          22
    To get the Avg Days, the Over Age calculation is used as well as the Days Between (Determination_Dt - Filed_Dt). That is the (date1-date2) example that I gave in my first post. So the calculation would be SUM(Days_Between) / Days_Between.
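    One way to guarantee a single value per call, sketched under the assumption that the figure wanted is the average of (determination_dt - filed_dt) over the over-age rows for a case type, is to let one aggregate query do all the work:

    create or replace FUNCTION CALC_OVER_AGE (pCaseType VARCHAR2)
    RETURN NUMBER IS
      v_avg_days NUMBER;
    BEGIN
      -- An aggregate over the filtered set always yields exactly one row,
      -- so SELECT INTO can no longer raise TOO_MANY_ROWS.
      SELECT AVG(determination_dt - Filed_dt)
        INTO v_avg_days
        FROM w_all_cases_mv
        WHERE case_type_cd = pCaseType
        AND determination_dt - Filed_dt > 60;
      RETURN NVL(v_avg_days, 0);  -- 0 when no over-age rows exist
    END CALC_OVER_AGE;

    AVG here is SUM(Days_Between) divided by the number of contributing rows, which matches the SUM(Days_Between) / Days_Between calculation described above while returning exactly one number.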

  • How can I set limitations on how many records a report can return

    I have a report on the web built with Oracle Reports Builder, and the client enters date parameters for the report they want.
    With date ranges, different clients get a different number of records back. Because of how long the report can take to return, I want to limit the number of records it can return.
    How can I go about doing that? I don't want to limit with date parameters, because dates won't really work for me; I need to limit how many records can be returned. If it exceeds 10,000 records, I want the client to either refine the date range or schedule the report to run later (meaning we will still run that report). So if the count was over 10,000, I would have two check boxes: do you want to refine your dates, or schedule the job to run later?
    Can anyone help me with this? How would I go about it?

    To know if the report is going to return more than 10,000 records, you first have to run the query with a 'select count(1) from ... where ...' (with the same FROM and WHERE clauses as your normal query). Since this takes about the same time as running your report, I wonder if you really gain anything (although formatting may take some time too).
    You may simplify the select count(1) query by omitting all the lookup tables that are only needed for formatting. That way your query may run a lot faster. You can put this in your After Parameter Form trigger.
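    A sketch of that pre-check, with hypothetical table and parameter names; in the After Parameter Form trigger you would run something like:

    -- Count against the driving table only; lookup tables used purely
    -- for formatting are omitted so the count runs faster.
    SELECT COUNT(1)
      INTO :row_count
      FROM cases c
      WHERE c.filed_dt BETWEEN :p_date_from AND :p_date_to;

    and then compare :row_count against the 10,000 limit before letting the report run.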

  • Is there any limit on how many records a cursor can hold?

    Hi Everyone,
    This is Amit here. I want to know whether there is any limit on how many records a cursor can hold.
    I have a program in which I am creating a cursor and passing it to another procedure as an input parameter, but the cursor query's count is more than 15 lakh (1.5 million) rows, and the program runs forever.
    I just wanted to know whether the huge volume of data is the problem.
    Thanks ....
    Regards,
    Amit

    user13079404 wrote:
    Just wanted to know whether the huge data is the problem.
    What do you think? How long does your code typically need to wait for the data to leave the magnetic platter of the hard disk, travel across wires and into the memory buffer of your application, for a single row?
    Now multiply that I/O waiting time by a million, for a million rows. Or by a billion, for a billion rows.
    Is "huge data" a problem? Not really; it simply needs more work to get that amount of data from disk. More work means slower performance. It is that simple.
    Which is why the row-by-row approach used by many developers is wrong. You do not pull a million rows from disk and process them in PL/SQL or Java or .Net. Heck, you do not even pull 10,000 rows like that.
    The correct approach is to think in data sets and use SQL to process them for you, returning only the bare minimum of data to the application layer. Maximize SQL. Minimize PL/SQL and Java and .Net.
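    For example (hypothetical names), rather than opening a cursor over 15 lakh detail rows and totalling them in the application, return only the summary:

    -- One round trip and a handful of rows back, instead of 1.5 million.
    SELECT region, COUNT(*) AS order_count, SUM(amount) AS total_amount
      FROM orders
      GROUP BY region;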

  • How to execute an Oracle stored procedure which returns many records?

    I have two synchronous scenarios XI<->PI<->JDBC, where JDBC is receiver adapter.
    Each scenario runs a different stored procedure in Oracle database.
    The first stored procedure returns only one record, the second stored procedure returns many records, which come from a cursor.
    In the first scenario I executed the stored procedure following the directions of Help SAP page and works perfectly.
    Link: [http://help.sap.com/saphelp_nw70/helpdata/en/2e/96fd3f2d14e869e10000000a155106/content.htm]
    <root>
      <StatementName5>
        <storedProcedureName action="EXECUTE">
          <table>realStoredProcedureName</table>
          <param1 [isInput="true"] [isOutput="true"] type=SQLDatatype>val1</param1>
        </storedProcedureName>
      </StatementName5>
    </root>
    I have searched the SDN forums and cannot find a way to run the second stored procedure and receive the information it returns.
    Thank you for your help.
    Rafael Rojas.

    I think it doesn't matter whether it is a cursor or a result set. Try to get the response back from JDBC and see exactly which fields it populates.
    In the procedure you can find the columns selected in the cursor; put those columns in the Data Type.
    File -> JDBC (Execute Procedure)
    To get the response:
    JDBC_response -> File
    Correct me if I'm wrong.
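    For reference, a sketch (hypothetical names) of what the Oracle side of such a procedure typically looks like, returning its records through a REF CURSOR OUT parameter:

    CREATE OR REPLACE PROCEDURE get_orders (
      p_status IN  VARCHAR2,
      p_result OUT SYS_REFCURSOR)
    IS
    BEGIN
      -- The caller receives the open cursor and fetches the rows from it.
      OPEN p_result FOR
        SELECT order_id, order_date, amount
          FROM orders
          WHERE status = p_status;
    END get_orders;

    The columns selected inside the OPEN ... FOR are the fields to model in the response Data Type.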
    Regards
    Ramg

  • Find out how many records are updated.

    Hello,
    I am executing an UPDATE statement in a loop from a JSP page, and this is working well.
    Suppose there are 10 records and I update 4 of them; they get updated fine.
    What I now need is to find out how many records have been updated. How can I know this?
    Thanks

    statement.executeUpdate() returns the number of records updated.
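    If the update were issued from PL/SQL instead, SQL%ROWCOUNT immediately after the statement gives the same count. A minimal illustration, with a hypothetical table:

    BEGIN
      UPDATE orders SET status = 'SHIPPED' WHERE batch_id = 42;
      -- SQL%ROWCOUNT holds the row count of the last DML statement.
      DBMS_OUTPUT.PUT_LINE(SQL%ROWCOUNT || ' rows updated');
    END;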
    Sai Pullabhotla

  • Vendor Catalog Numbers

    Hi all,
    I've noticed that both my Goods Issue and Goods Receipt screens are blank in the vendor columns (i.e. Customer/Vendor Ref. No., Customer/Vendor Name, etc.), and I want to query stock adjustments supplier-wise.
    Can this be rectified by assigning vendor catalog numbers to items in the system? Because at the moment we're only using item codes.
    And if so, what is the procedure to update the existing items with vendor reference numbers and to ensure that all new items are either automatically or manually updated? Thanks
    Regards,
    Henry

    Hi Henry,
    If you do want to query stock adjustments supplier-wise, the only possible way is to set up all items with unique vendor catalog numbers. Please note that this means if you have one item with multiple vendors, you end up with duplicate items with different codes everywhere.
    It might not be a good practice for you in the long run.

  • Report timing out, "Too many records"

    I am using CR XI R2 with a universe that I created to run an inventory report. The report works fine in the Designer, but when I run it from the CR Server it times out (very quickly). If I add filters and only pull a subset of the data, the report works fine. The issue is that there should not be too many records. Where can I check to make sure that there are no limits set, so that I can get all of the records? I have set the Universe parameters to not limit the number of records or the execution time. Is there any place else I can check, or any other ideas?
    Thanks

    What viewer do you use?
    If the viewer is ADHTML, you are using the RAS (Report Application Server). In that case, you need to set the number of (db) records to read in the RAS properties on the CMC.
    If the viewer is DHTML, Java or ActiveX, you are using the CR Page Server. You will need to set the number of db records to read in the Crystal Reports Page Server properties on the CMC.

  • How do we track how many records INSERT/UPDATE/DELETE per day

    Hi All,
    We have around 150 tables in the database. Many people can access these tables, and each of them has rights to INSERT, UPDATE and DELETE records. I want to know how many records are inserted, updated and deleted per day. Is it possible to find out?
    I have some questions please clarify the doubts.
    1. If we create a table, it is recorded in ALL_OBJECTS/USER_OBJECTS and in tabs/tab/user_tables, so we can find out the table name and columns, and which tables were created in the database.
    2. If we insert, delete, or update records in a table, we can find the corresponding table. But apart from identifying the table, is there any way to find out about the records themselves?
    As I said above, any table I create is recorded in the object views.

    Schedule a periodic DBMS_STATS.FLUSH_DATABASE_MONITORING_INFO and query USER_TAB_MODIFICATIONS. This view shows DML activity against each table and is updated when database monitoring info is flushed. The flush is automatic but if you need control over the frequency of updates, you can explicitly call the FLUSH.
    In 9i you have to "ALTER TABLE table_name MONITORING ;" before you can use this method.
    In 10g, this is enabled by default unless STATISTICS_LEVEL is set to BASIC.
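    For example, a minimal sketch of the flush-then-query sequence:

    BEGIN
      -- Push the in-memory DML counters into the dictionary view.
      DBMS_STATS.FLUSH_DATABASE_MONITORING_INFO;
    END;
    /
    SELECT table_name, inserts, updates, deletes, timestamp
      FROM user_tab_modifications
      ORDER BY table_name;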
    See http://download.oracle.com/docs/cd/B19306_01/server.102/b14237/statviews_4465.htm#sthref2375
    and
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14231/tables.htm#ADMIN01508
    Another useful view is V$SEGMENT_STATISTICS (or the faster view V$SEGSTAT) which provides information in the same manner as in the V$SYSTAT / V$SESSTAT views.
    Hemant K Chitale
    http://hemantoracledba.blogspot.com
