In BDC I have a huge volume of data to upload for the given transaction

Hi gurus,
In BDC I have a huge volume of data to upload for the given transaction. Here I am using the session method, and it takes a lot of execution time to complete the whole transaction. Is there any other method to process this huge volume in minimum time?
reward awaiting
with regards
Thambe

Selection of the BDC method depends on the type of requirement you have, but you can decide which one suits your requirement based on the differences between the two methods. The following are the differences between Session and Call Transaction.
Session method:
1) Synchronous processing.
2) Can transfer large amounts of data.
3) Processing is slower.
4) An error log is created.
5) Data is not updated until the session is processed.
Call Transaction:
1) Asynchronous processing.
2) Can transfer small amounts of data.
3) Processing is faster.
4) Errors need to be handled explicitly.
5) Data is updated automatically.
Batch Data Communication (BDC) is the oldest batch interfacing technique, provided by SAP since the early versions of R/3. BDC is not a typical integration tool in the sense that it can only be used for uploading data into R/3, so it is not bi-directional.
BDC works on the principle of simulating user input for transactional screens via an ABAP program.
Typically the input comes in the form of a flat file. The ABAP program reads this file and formats the input data screen by screen into an internal table (BDCDATA). The transaction is then started using this internal table as the input and executed in the background.
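As a rough illustration, here is a minimal Call Transaction sketch of that pattern; the transaction code, module pool, screen number, and field name below are placeholders, not taken from this thread:

DATA: it_bdcdata  TYPE TABLE OF bdcdata,
      wa_bdcdata  TYPE bdcdata,
      it_messages TYPE TABLE OF bdcmsgcoll.

" Mark the start of a screen (module pool and screen number are placeholders)
CLEAR wa_bdcdata.
wa_bdcdata-program  = 'SAPMZDEMO'.
wa_bdcdata-dynpro   = '0100'.
wa_bdcdata-dynbegin = 'X'.
APPEND wa_bdcdata TO it_bdcdata.

" Supply one input field on that screen (field name is a placeholder)
CLEAR wa_bdcdata.
wa_bdcdata-fnam = 'ZDEMO-MATNR'.
wa_bdcdata-fval = 'MAT001'.
APPEND wa_bdcdata TO it_bdcdata.

" Run the transaction in background mode; messages collect in it_messages
CALL TRANSACTION 'ZDEM' USING it_bdcdata
     MODE 'N'
     UPDATE 'S'
     MESSAGES INTO it_messages.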
In ‘Call Transaction’, the transactions are triggered at the time of processing itself, so the ABAP program must do the error handling. It can also be used for real-time interfaces and custom error handling and logging features. With Batch Input Sessions, by contrast, the ABAP program creates a session with all the transactional data, and this session can be viewed, scheduled, and processed (using transaction SM35) at a later time. The latter technique has a built-in error-processing mechanism too.
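For the session technique, a corresponding sketch hands the same BDCDATA table to the standard batch-input function modules; the session name and transaction code are again placeholders:

" Open a batch-input session (it can later be processed via SM35)
CALL FUNCTION 'BDC_OPEN_GROUP'
  EXPORTING
    client = sy-mandt
    group  = 'ZDEMO_SESS'
    user   = sy-uname
    keep   = 'X'.

" Add one transaction's screen data; call once per transaction
CALL FUNCTION 'BDC_INSERT'
  EXPORTING
    tcode     = 'ZDEM'
  TABLES
    dynprotab = it_bdcdata.

" Close the session so it becomes visible in SM35
CALL FUNCTION 'BDC_CLOSE_GROUP'.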
Batch Input (BI) programs still use the classical BDC approach but do not require an ABAP program to be written to format the BDCDATA. The user has to format the data using predefined structures and store it in a flat file. The BI program then reads this file and invokes the transaction mentioned in the header record of the file.
Direct Input (DI) programs work much like BI programs, the only difference being that, instead of processing screens, they validate fields and load the data directly into tables using standard function modules. For this reason, DI programs are much faster (RMDATIND, the Material Master DI program, works at least 5 times faster) than their BDC counterparts, and so are ideally suited for loading large volumes of data. DI programs are not available for all application areas.
Synchronous & asynchronous updating:
http://www.icesoft.com/developer_guides/icefaces/htmlguide/devguide/keyConcepts4.html
Synchronous & asynchronous processing:
Asynchronous refers to processes that do not depend on each other's outcome and can therefore occur on different threads simultaneously. The opposite is synchronous: synchronous processes wait for one to complete before the next begins. For those Group Policy settings for which both types of processing are available as options, you choose between the faster asynchronous processing and the safer, more predictable synchronous processing.
By default, the processing of Group Policy is synchronous: computer policy is completed before the CTRL+ALT+DEL dialog box is presented, and user policy is completed before the shell is active and available for the user to interact with.
Note
You can change this default behavior by using a policy setting for each so that processing is asynchronous. This is not recommended unless there are compelling performance reasons. To provide the most reliable operation, leave the processing as synchronous.

Similar Messages

  • How to get the date of Friday for the given week

    Hi,
    I have a requirement to get the date of the Friday of a given week.
    E.g.: for an input of 200722 (year + week), I need to get the date of the Friday of week 22 of year 2007.
    Please let me know how to get this.
    Thanks in advance,
    Sridhar.

    Hi,
    Thanks for your reply...
    I have implemented your logic but am not getting the exact output required.
    For example, if I give the input as 200720, I get 18 (the Friday of week 20 of year 2007), but I need the full date, 18.05.2007.
    Please let me know how to do this.
    Thanks in advance..
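    One possible approach, sketched in ABAP: WEEK_GET_FIRST_DAY is a standard function module that returns the Monday of a week given in YYYYWW format, so adding 4 days yields the Friday. The final WRITE only formats the output, with the separator taken from the user's defaults.

    DATA: lv_week   TYPE scal-week VALUE '200720',
          lv_monday TYPE scal-date,
          lv_friday TYPE scal-date.

    " Get the Monday of week 20 / 2007
    CALL FUNCTION 'WEEK_GET_FIRST_DAY'
      EXPORTING
        week = lv_week
      IMPORTING
        date = lv_monday.

    " Monday + 4 days = Friday, i.e. 18.05.2007 for input 200720
    lv_friday = lv_monday + 4.

    WRITE lv_friday DD/MM/YYYY.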

  • Displaying a group of data or info for the given key in a property file

    Hi,
    I have a property file, say queries.properties.
    It contains properties and values such as:
    profile_1=abc
    profile_2=dfg
    profile_3=ser
    name_1=qwe
    name_2=dfg
    name_3=cbvb
    My requirement is that when I pass the key "profile", it has to give
    profile_1=abc
    profile_2=dfg
    and so on - all profile-related properties and their values.

    It should display all the profile-related keys and their corresponding values.
    The reason behind this requirement is to improve performance: if we pass a single key, we get only that key's value at a time, but I need all the profile-related properties and their values at once.

  • Uploading a huge volume of data with DTPs

    Hi Experts,
    I have a problem with uploading a huge volume of data with DTPs.
    My initialization is done, as I am doing reloads. Now I have data from fiscal year/period 000.2010 to 016.9999.
    I have a huge volume of data.
    I have tried uploading this data in chunks, dividing it into 3 months per DTP and making full loads.
    But when I processed the DTP, the data packages were decided at the source, and I got about 2,000 data packages.
    Now my request turns red after processing about 1,000 data packages, and the batch processes allocated to it have also stopped.
    I have tried dividing the DTP by month only and processing it, with the same problem. I have deleted the indexes before uploading to the cube and changed the batch processing setting from 3 to 5.
    Please can anyone advise what the problem could be? I am uploading these reloads in the quality system.
    How can I upload this data, which runs into millions of records?
    Thanks,
    Tati

    Hi Galban,
    I have changed the parallel processing from 3 to 5, and have also looked at the data package size.
    Can you please advise how I can increase the data package size? For my upload, the package size corresponds to the package size in the source and is determined dynamically at runtime.
    Please advise.
    Thanks
    Tati

  • Error while extracting huge volumes of data from BW

    Hi,
    We see this error while extracting huge volumes of data (approx. 3.4 million rows, with a large number of columns):
    R3C-151001: |Dataflow DF_SAPSI_SAPSI3131_SAPBW_To_Teradata
    Error calling R/3 to get table data: <RFC Error:
    Key: TSV_TNEW_PAGE_ALLOC_FAILED
    Status: EXCEPTION SYSTEM_FAILURE RAISED
    No more storage space available for extending an internal table.
    >.
    We are not sure if DoP (degree of parallelism) works with SAP BW as the source, but when we tried with DoP as well, we got the same error.
    Will this issue be resolved with an R/3 or ABAP dataflow? Can anyone suggest some possible solutions for this scenario?
    Sri

    The problem is that you've reached the maximum memory configured for your system.
    If this is a batch job, reconfigure the profile parameter
    abap/heap_area_nondia
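    For reference, such an entry in the instance profile might look like the line below. The value is in bytes and is only an example; size it to the host's memory (transaction RZ11 shows the parameter's documentation and current value):

    abap/heap_area_nondia = 4000000000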
    Markus

  • I have a huge amount of data on a Windows external drive and want to transfer it to a Mac drive. Does anyone know an easy way to do this? I have almost 2 TB of data to transfer. Thanks.

    I have a huge amount of data (2 TB) on a Windows Fantom external drive and want to transfer it to a Mac drive. Does anyone know an easy way to do this? Thanks. I have an iMac with a 3.5 GHz Intel Core i7. I haven't bought a Mac external drive yet.

    Move your data to a new Mac - Apple Support

  • A huge volume of records is routing to the remote user beyond his position and organization records; synchronization and DB initialization are taking around 36 hours.

    A huge volume of records is routing to the remote user beyond his position and organization records. Synchronization and DB initialization are taking around 36 hours.
    Only around 2,000 accounts and 3,000 contacts actually need to be routed, but we have observed lakhs (hundreds of thousands) of records routing into the local DB.
    We have verified all the Assignment Rules and Views.
    We ran the docking object visibility rules and observed that some other accounts are routing because the Organization rule passes (these records are not supposed to route).
    Version Siebel 7.7.2.12,
    OS Solaris.

    "Let me know what would be the reason that the 1st million takes only 15 minutes and the time goes on increasing gradually with the increase of data."
    Yes, that's a little strange. I can only guess:
    1. You are in archivelog mode and the archiver is not able to archive the redo logs fast enough.
    2. You don't use Direct Load, and DBWR is not able to write the dirty blocks to disk fast enough. You could create more DBWR processes in that case.
    3. Make a snapshot of v$system_event before the import run:
    create table begin as select * from v$system_event;
    After the import run:
    create table end as select * from v$system_event;
    Now compare the values:
    select * from begin order by TIME_WAITED_MICRO desc;
    with the values given to you by:
    select * from end order by TIME_WAITED_MICRO desc;
    That way you can see where your DB spent so much time waiting for something.
    Alternatively, you could start a 10046 trace on the loading session and use tkprof.
    Dim

  • HT201401 I have no volume on recorded videos; what could the problem be?

    I have no volume on recorded videos; what could the problem be?

    Do you have no volume control, or no sound at all?

  • I have an iPhone 5s. When I checked the available space, it said I had over 3 GB of "Other" files on my iPhone. I've only had my iPhone for three days; is it normal to have that much "Other" data?

    I have an iPhone 5s. When I checked the available space, it said I had over 3 GB of "Other" files on my iPhone. I've only had my iPhone for three days; is it normal to have that much "Other" data?

    Around a gig is normal.
    Also, if you tend to send/receive a lot of MMS messages, that will cause "Other" to increase, especially if you keep those messages.

  • Data Recovery Software for the Zen Vision M

    Data Recovery Software for the Zen Vision M:
    Has anyone had any success recovering lost sound files from a Vision M, with software, physical repair services, or otherwise? Does anyone claim to be able to recover information from these devices?
    The data recovery programs I have tried fail to recognize the drive, since it doesn't show up in the list of drives for some reason (is there a way to make it show up?). Any suggestions/recommendations would be greatly appreciated, as I lost over a year of recordings after an unexpected forced reformatting. How would one even go about finding someone for this job?
    Thanks, GW

    Haven't really tried data recovery, so I would like to know more about this one too.
    http://storeyourpicture.com/images/s...lectronics.jpg

  • [HELP] Error: "JDBC theme based FOI support is disabled for the given data"

    Hi all,
    I have already set up MapViewer version 10.1.3.3 on the BISE1 10g OC4J server. I am currently using JDK 1.6. I created an mvdemo/mvdemo user for the demo data.
    The MapViewer demos run fine for those without a CHART, but this error comes up with some maps that have a CHART, like "Dynamic theme, BI data and Pie chart style" and "Dynamic theme and dynamic Bar chart style". The error is:
    ----------ERROR------------
    Cannot process the FOI response from MapViewer server. Server message is: <?xml version=\"1.0\" encoding=\"UTF-8\" ?> <oms_error> MAPVIEWER-06009: Error processing an FOI request.\nRoot cause:FOIServlet:MAPVIEWER-06016: JDBC theme based FOI support is disabled for the given data source. [mvdemo]</oms_error>.
    ----------END ERROR------
    I searched many threads on this forum; some pointed me to set allow_jdbc_theme_based_foi="true" in mapViewerConfig.xml and restart MapViewer.
    <map_data_source name="mvdemo"
    jdbc_host="localhost"
    jdbc_sid="bise1db"
    jdbc_port="1521"
    jdbc_user="mvdemo"
    jdbc_password="mvdemo"
    jdbc_mode="thin"
    number_of_mappers="3"
    allow_jdbc_theme_based_foi="true"
    />
    Error image: http://i264.photobucket.com/albums/ii176/necrombi/misc/jdbcerror.png
    I have this configuration, but no luck. Could anyone show me how to resolve this problem?
    Rgds,
    Dung Nguyen

    Oops, I managed to get this problem resolved!
    My problem may have come from using both the scott and mvdemo schemas for keeping the demo data (imported demo data).
    The steps I took to resolve it were:
    1) Undeploy MapViewer from Application Server Control (http://localhost:9704/em in my case)
    2) Drop user mvdemo
    3) Download mapviewer kit from Oracle Fusion Middleware MapViewer & Map Builder Version 11.1.1.2
    4) Deploy MapViewer again
    5) Recreate mvdemo and import demo data
    6) Run mcsdefinition.sql, mvdemo.sql with mvdemo user (granted dba already)
    7) Edit mapViewerConfig.xml
    <map_data_source name="mvdemo"
    jdbc_host="dungnguyen-pc"
    jdbc_sid="bise1db"
    jdbc_port="1521"
    jdbc_user="mvdemo"
    jdbc_password="!mvdemo"
    jdbc_mode="thin"
    number_of_mappers="3"
    allow_jdbc_theme_based_foi="true"
    />
    Save & Restart MapViewer
    And now all demos run fine. Hope this is helpful for anyone who meets a problem like mine.

  • Passing values to subreport in SSRS throwing an error - Data Retrieval failed for the report, please check the log for more details.

    Hi,
    I have a subreport called from the main report. The subreport is based on an MDX query against the SSAS cube. Some dimensions in the cube have the values 0 and 1.
    When I try to pass '0' to the subreport as the parameter value, it gives the error "Data Retrieval failed for the report, please check the log for more details".
    Actually, I am using a table to store these parameter values. In the main report I am calling this table (dataset) and passing these values to the subreport.
    So I have given [0],[1] and this works fine; when I give only [0] or [1], it throws the error.
    Could you please advise on this?
    Appreciate any and all help.
    Thanks,
    Divya

    Hi Divya,
    Based on the current description, I understand that there is no issue if you pass two values from the main report to the subreport, while the issue occurs when passing one value.
    To narrow down the issue, I want to confirm whether the subreport can run on its own with only [0] or [1]. If it cannot, that indicates the subreport's query statements contain an error. If it can, the issue occurs while passing values from the main report to the subreport. For further analysis, please post the details of the subreport's query statements to the forum.
    Regards,
    Heidi Duan
    TechNet Community Support

  • DTW error message - Data too large for the field65171

    I am trying to load General Ledger accounts using DTW in version 8.8 and have been successful with level 2. The customer has three levels only. The total number of characters for the various segments is 18, and I have double- and triple-checked that the "Code" column has exactly 18 characters. I even used "=TRIM" in Excel several times.
    When I run DTW for level 3, it errors out and displays the above error message. What is field 65171?
    I thought DTW was supposed to be much improved in 8.8 - where are the help files that tell you what the errors mean?
    Regards - Zal

    Hi ZAL,
    I am facing this error too when importing the chart of accounts through DTW on SAP 8.8 PL 19; some accounts are imported, but some give "data too large for the field65171".
    The account code is 7 characters, while the maximum is 9.
    Please advise me on this.
    Regards
    rps

  • Data retrieval failed for the subreport

    Hi All,
    I am getting this error when I try to bind a subreport within a report:
    Data retrieval failed for the subreport, 'Subreport2', located at: E:\Anil\Sample.rdlc. Please check the log files for more information.
    Thanks,
    Anil Kumar Dhiman

    Hi Anil,
    This is a known issue, which you can see here: https://connect.microsoft.com/SQLServer/feedback/details/648560/subreport-with-shared-dataset-throws-error
    This issue occurs only in Business Intelligence Development Studio (BIDS) when a shared dataset is used in the subreport, and it is fixed in SQL Server Reporting Services 2012. So you can avoid this issue by changing the shared dataset to an embedded dataset, or by deploying the shared dataset to the Report Server and viewing the report in Report Manager.
    If you have any questions, please feel free to ask.
    Regards,
    Charlie Liao
    TechNet Community Support

  • Data source properties for the rulefiles.

    Could someone explain the use of the field edits like move/split/join/create in the data source properties for the rule files?

    Move/split/join/create are very useful for adjusting data during a load without going back to the source file and changing it. For example, if you have an entity dimension with member names like
    east10569
    east10658
    and you want to add a prefix to these members to make them unique, you can use those field edits.
    The other use I have seen is concatenation to create unique names, etc.
    Hope this helps.
