B1IF mass data processing (conversion) issue

Hi All,
I have an issue converting a large amount of data that I queried from my SAP database via a B1IF SQL call.
The data must be converted to another format before the XML is converted to a string and dropped on MQ.
I have 120 other messages that work the same way without any issue.
The process just stops when I try to convert the data to a string.
The atoms with data look like this before being passed to the XML2TXT call. The XML disappeared and changed to text (maybe this is because it is such a mass of data?).
Please assist...
Kind Regards,
Brenden Draper

Similar Messages

  • Call Bundling for custom BAPI for mass data processing

    Hi all,
    http://help.sap.com/saphelp_erp2005vp/helpdata/en/4c/4c0e96725311d396a80004ac96334b/frameset.htm
    Can I create a custom BAPI where I can bundle the created update tasks? Not single inserts, but a single SQL INSERT with many records.
    Are there SAP function modules to do this? The documentation says I must do "Operations in buffer" and "Update buffer data".
    Regards
    Paul

    Is ABAPFIELD an IMPORTING parameter?
    > Total Questions:  17 (15 unresolved) 
    Maybe you should consider cleaning up your old posts.
    Rob
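
    What Paul describes is the classic array-insert pattern: collect the rows in an internal table and pass the whole table to one update function module, which writes them with a single INSERT ... FROM TABLE instead of many single-row inserts. A minimal sketch, assuming a hypothetical custom table ZTAB with a matching table type ZTAB_TT (the names are illustrative):
      FUNCTION z_bundle_insert.
      *"  IMPORTING
      *"     VALUE(IT_ROWS) TYPE ZTAB_TT
        " One array insert for the whole package of records.
        INSERT ztab FROM TABLE it_rows.
      ENDFUNCTION.

      " Caller: collect rows, register one update task, then commit.
      " (The FM must be flagged as an update module in SE37.)
      CALL FUNCTION 'Z_BUNDLE_INSERT' IN UPDATE TASK
        EXPORTING
          it_rows = lt_rows.
      COMMIT WORK.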

  • Parallel processing of mass data: sy-subrc value is not changed

    Hi,
    I have used parallel processing of mass data with "STARTING NEW TASK". In my function module I handle the exceptions and finally raise the application-specific exception, to be handled in my main report program. Somehow sy-subrc does not change and always returns 0, even if the exception is raised.
    Can anyone help me with this?
    Thanks & Regards,
    Nitin

    Hi Silky,
    I've built a block of code to explain this.
      DATA: ls_edgar TYPE zedgar,
            l_task(40).
      DELETE FROM zedgar.
      COMMIT WORK.
      l_task = 'task1'.
      ls_edgar-matnr = '123'.
      ls_edgar-text = 'qwe'.
      CALL FUNCTION 'Z_EDGAR_COMMIT_ROLLBACK' STARTING NEW TASK l_task PERFORMING f_go ON END OF TASK
        EXPORTING
          line = ls_edgar.
      l_task = 'task2'.
      ls_edgar-matnr = 'abc'.
      ls_edgar-text = 'def'.
      CALL FUNCTION 'Z_EDGAR_COMMIT_ROLLBACK' STARTING NEW TASK l_task PERFORMING f_go ON END OF TASK
        EXPORTING
          line = ls_edgar.
      l_task = 'task3'.
      ls_edgar-matnr = '456'.
      ls_edgar-text = 'xyz'.
      CALL FUNCTION 'Z_EDGAR_COMMIT_ROLLBACK' STARTING NEW TASK l_task PERFORMING f_go ON END OF TASK
        EXPORTING
          line = ls_edgar.
    *&      Form  f_go
    FORM f_go USING p_task TYPE clike.  "receives the aRFC task name
      RECEIVE RESULTS FROM FUNCTION 'Z_EDGAR_COMMIT_ROLLBACK' EXCEPTIONS err = 2.
      IF sy-subrc = 2.
    *this won't affect the LUW of the received function
        ROLLBACK WORK.
      ELSE.
    *this won't affect the LUW of the received function
        COMMIT WORK.
      ENDIF.
    ENDFORM.                    "f_go
    and the function is:
    FUNCTION z_edgar_commit_rollback.
    *"*"Interface local:
    *"  IMPORTING
    *"     VALUE(LINE) TYPE  ZEDGAR
    *"  EXCEPTIONS
    *"      ERR
      MODIFY zedgar FROM line.
      IF line-matnr CP 'a*'.
    *comment raise or rollback/commit to test
    *    RAISE err.
        ROLLBACK WORK.
      ELSE.
        COMMIT WORK.
      ENDIF.
    ENDFUNCTION.
    ok.
    In your main program you have a Logical Unit of Work (LUW), which consists of an application transaction and is associated with a database transaction. Once you start a new task, you're creating an independent LUW with its own database transaction.
    So if you do a commit or rollback in your function, the effect is only on the records you're processing in the function.
    There is a way to capture the event when this LUW concludes in the main LUW: that is the PERFORMING whatever ON END OF TASK. In there you can get the result of the function, but you cannot commit or roll back the LUW from the function, since that has already happened implicitly at the conclusion of the function. You can test it by commenting the code I've supplied accordingly.
    So, if you want to roll back the LUW of the function, you'd better do it inside it.
    I don't think it matches your question exactly; maybe it leads you onto the right track. Give me more details if it doesn't.
    Hope it helps,
    Edgar
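
    One caveat worth adding to Edgar's example (my addition, not from his post): if the main report ends before the asynchronous tasks finish, f_go may never run, so wait for the callbacks before evaluating their results. A small sketch using a global counter:
      DATA gv_done TYPE i.
      " ... after the three CALL FUNCTION ... STARTING NEW TASK calls:
      WAIT UNTIL gv_done >= 3 UP TO 10 SECONDS.  "lets the callbacks run
      " and inside FORM f_go, after RECEIVE RESULTS:
      "   gv_done = gv_done + 1.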

  • Value Mapping Replication for Mass Data - Performance Issues

    Hi All,
    We are looking into value mapping replication for mass data. We have done this for a smaller number of fields.
    Now we might have to keep 15,000 records in the cache for the value mapping. I am not sure how this would affect the Java cache and the Java engine as a whole.
    There might be a situation where we will have to leave the 15K records in the cache table on the Java engine...
    Are there any parameters we can look into, just to see how this hits performance?
    Any links/guidance in the right direction would help me.
    reg

    Naveen,
    Check Jin's reply in this thread (they have done it with the API, and without the API using graphical mapping, but still had some issues):
    Value mapping performance using LookUp API
    ---Satish

  • Issues in mass data Upload

    Hi All,
            Hope you all are doing fine.
            I have to do a master data upload for my next project. I have gone through LSMW, DX-WB, recording etc., and now I am quite comfortable with all of these.
            As I have only tested these tools with at most 8-10 records, I am interested in hearing your experiences with actual data uploads where the volume is high and the data may be difficult to verify manually. In particular, I am interested in how to make the upload faster, error-free and consistent (no record being posted twice, etc.).
            Your inputs would be highly appreciated.
            Thanks a lot for your patient reading.
    Bye and Regards.

    Hi Navdeep,
    the approach to mass data upload depends on the data you want to upload; there are several function modules for uploading data, for example for creating reservations or uploading long texts for inspection methods. Tell me what data you want to upload.
    Best regards,
    Waleed Sadat
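
    For the "error-free and consistent" part of the question, a common pattern is to post each record with CALL TRANSACTION and collect the returned messages, so failed records can be logged and reprocessed instead of being lost or posted twice. A minimal sketch (the transaction code and the filling of the BDC data are illustrative, not from this thread):
      DATA: lt_bdcdata TYPE TABLE OF bdcdata,
            lt_msgs    TYPE TABLE OF bdcmsgcoll,
            ls_msg     TYPE bdcmsgcoll.

      " lt_bdcdata is filled with the screen flow for one record beforehand.
      CALL TRANSACTION 'MM01' USING lt_bdcdata
        MODE   'N'      "background, no screens
        UPDATE 'S'      "synchronous update: the result is known immediately
        MESSAGES INTO lt_msgs.

      LOOP AT lt_msgs INTO ls_msg WHERE msgtyp = 'E' OR msgtyp = 'A'.
        " log the key of the failed record for a later re-run
      ENDLOOP.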

  • Developing SAP Reports Using Mass Data Runtime (MDR)?

    Hi Experts,
    Does anyone have experience with MDR reporting?
    I actually implemented this in my previous project, where it was used for parallel processing.
    The problem I am currently facing is that when I try to open transaction /BTR/MDR, it throws the error "transaction doesn't exist".
    Why does this transaction not open in the system? Please guide me on this issue.
    Thanks in Advance

    Hi Nawaz,
    Mass Data Runtime (MDR) is a third party software component. If it has not been purchased and installed in the system you are working on, it is unlikely that the transaction will exist.
    Hope that helps.
    Christian

  • Printing Adobe PDF using mass/batch processing

    I've got Purchase Order, Contract and RFQ forms created using Adobe PDF print forms. Currently, immediate processing is used to print/email/fax the forms (that means that after a PO/Contract/RFQ is created, it is immediately printed/emailed/faxed). An enhancement I'm working on is to allow mass/batch processing, so that the forms are printed/emailed/faxed at a set time each day. The issue I'm facing is that with mass processing the PDFs become corrupted (for example, the first PDF form is sent OK, but subsequent forms become corrupted or duplicates of the first one). When customers receive a corrupted PDF, they get the error that the form cannot be opened.
    Does anyone have a suggestion on how to accomplish batch/mass processing using Adobe PDF forms? Right now the immediate processing path is working fine, since there is low volume. However, the goal is to switch to mass processing once volume increases.

    I copied and made modifications to the SAP standard print program SAPFM06P and called it ZSAPFM06P. I also made copies of the include files. So for immediate output processing, each time a Purchase Order PDF is created it calls the print program ZSAPFM06P. This process works great no matter how many PDFs are created from POs.
    The issue is with BATCH/MASS PROCESSING. Here is the scenario:
    - Purchase requisitions are created anytime during the day.
    - Every 15 mins., the purchase reqs are batched together and an automatic process starts that converts them to purchase orders.
    - Creating each PO calls the print program ZSAPFM06P and depending on the communication strategy, the POs are either printed, emailed or faxed to the customer.
    The problem seems to be that when multiple POs are created, some get corrupted and some become duplicates of other POs. However, with immediate processing this is never the case, no matter how many POs are created.
    I checked the program, and the internal tables and variables used in the code do get cleared/reset each time a PO is created, so I don't see how immediate processing works while batch processing doesn't, even though they both call the same program.
    I hope this makes it a little clearer.
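
    One thing worth verifying (my suggestion; the thread itself doesn't confirm the cause): in the batch run, each document must get its own complete form job, i.e. the FP_JOB_OPEN / generated form function module / FP_JOB_CLOSE bracket has to sit inside the loop over the POs. Keeping one job open across documents is a known way to end up with duplicated or corrupted PDFs. A minimal sketch with the standard form-processing calls (the form name and the PO table are illustrative):
      DATA: ls_outputparams TYPE sfpoutputparams,
            ls_docparams    TYPE sfpdocparams,
            lv_funcname     TYPE funcname.

      " lt_pos / ls_po: your list of purchase orders (illustrative).
      LOOP AT lt_pos INTO ls_po.          "one complete job per PO
        CALL FUNCTION 'FP_JOB_OPEN'
          CHANGING
            ie_outputparams = ls_outputparams.

        CALL FUNCTION 'FP_FUNCTION_MODULE_NAME'
          EXPORTING
            i_name     = 'ZPO_FORM'       "hypothetical form name
          IMPORTING
            e_funcname = lv_funcname.

        CALL FUNCTION lv_funcname         "generated FM of the form
          EXPORTING
            /1bcdwb/docparams = ls_docparams.
            " ... plus the form interface parameters for this PO ...

        CALL FUNCTION 'FP_JOB_CLOSE'.     "closes and outputs this PDF
      ENDLOOP.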

  • Custom Data Processing Extension, use in SSRS Report Properties - References

    I've built a Custom Data Processing Extension (CDPE) and registered it successfully (i.e. it shows up in the new data source dialog/drop-down and saves just fine, for VS2010-2014). It is intended to be a custom (XML-based) data source. However, based on the nature of the beast, I also need a Custom Query Designer (CQD) for development testing of the CDPE.
    Here are the errors I get for the CQD:
    Pulling a report up in "Report Preview", which is wired to the CDPE->CQD, I get:
    "An error occurred during local report processing. The definition of the report '/TestDS' is invalid. Error while loading code module: 'Microsoft.ReportingServices.Interfaces, Version=11.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'. Details: Could not load file or assembly 'Microsoft.ReportingServices.Interfaces, Version=11.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)"
    My CDPE directly includes Microsoft.ReportingServices.Interfaces.dll AND matches everything as far as version and key. It also includes Microsoft.ReportingServices.QueryDesigners.dll, which is required for the CQD.
    I've written other WORKING CDPEs, but not one with a CQD (a custom replacement for the query designer in Visual Studio). All the references, from what I can tell, are OK. I think CQDs are broken for XML data sources; the interfaces are not right (I will explain further on).
    From "Data Sources" -> Dataset Properties, when I click on "Query Designer", I get:
    "An error occurred while loading the query designer 'DATASET' (which is the name of the CDPE). Query Designer: Object reference not set to an instance of an object."
    I "think" XML-type CDPEs are trying to execute a web service call, instead of working properly with a text-based query for XML. The reason I say this is that I've created both WinForm and WebForm test harnesses, and they both come up with this error: "...Failed to prepare web request for the specified URL. (rsXmlDataProviderError), Invalid URI: The URI is empty." (which is nonsense; there is no request, the query is simply text/file-based, and I can read locally ALL of the XML data expected for testing without issue; I'm ONLY making the CDPE XML-based because I have custom WCF calls which already work). (If you really want to understand the overall architecture, please see my post: http://social.msdn.microsoft.com/Forums/en-US/d15d9206-95d7-473a-a7f9-a38b4279de8c/ssrs-extension-which-to-use?forum=sqlreportingservices&prof=required )
    Other than "100 mile" overviews from Microsoft, this has got to be some of the worst-documented stuff I've ever seen ( http://msdn.microsoft.com/en-us/library/microsoft.reportingservices.interfaces.iquerydesigner.aspx ). Remote debugging doesn't work 95% of the time.
    My environment is VS2013 Ultimate with BI and SQL Server 2012 SP1.
    Thanks, Rob
    Rob K

    Update:
    I can now see the Custom Query Designer and get the anticipated results (after some fooling around with different combinations).
    Here's how things were broken by the MS SQL Server 2012 product/release team:
    1. they upgraded to .NET v4.x here (to support SharePoint, AX, MS Data Tools, etc.):
    C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\Microsoft.ReportingServices.QueryDesigners.dll
    C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\Microsoft.ReportingServices.Interfaces.dll
    2. they left C:\Program Files\Microsoft SQL Server\MSRS.11.MSSQLSERVER\Reporting Services\ReportServer\bin\Microsoft.ReportingServices.Interfaces.dll at .NET v2.x
    3. they don't support custom extensions (which use a query designer) with anything higher than .NET v3.5
    In my case, I had to segregate:
    a. the Report Definition Custom Extension to v4.5
    b. the Custom Data Processing Extension to v3.5
    c. the Custom Query Designer to v4.x
    d. my WCF/SSO to v4.5.1
    #2 and #3 above are, in my humble opinion, simply dead wrong as far as what you ever want to do in a release cycle (I can see there being an early/first-release exception, but two years out, with a successor product (2014), this should have been rectified more than a year ago).
    Whoever failed to get this communicated in the 2012 documentation created even more havoc than an average developer can decipher:
    http://msdn.microsoft.com/en-us/library/microsoft.reportingservices.interfaces.iquerydesigner(v=sql.110).aspx
    (I'm still working on how to get the remote debugger working consistently.)
    Rob

  • Calling a Stored Procedure using SSRS Custom Data Processing Extension

    I need an SSRS Custom Data Processing Extension to call a stored procedure for my SSRS report. I have referred to many links regarding this, but I cannot find one; instead, there are examples of data processing extensions that use XML files and multiple data sources.
    I want the Data Processing Extension to call a stored procedure.
    Please help. Thanks in advance.

    Sorry, why do you need a Data Processing Extension for that? Can't you call the procedure directly from an SSRS dataset? What's the RDBMS that holds this procedure?
    Visakh

  • Error while executing master data process chain

    hi,
    I'm trying to execute a master data process chain in BI 7.0, but I'm getting errors at the DTP step of two InfoObjects. The error message is: "Request 357 is already being processed" and "Exception CX_RSBK_REQUEST_LOCKED logged".
    Can anyone tell me the reason and how to resolve this issue?
    Thanks
    Hima

    Hi Hima
    Check if any other load is fetching the same request, or go to SM37, open the job log and check the status of the job. If it is finished, check the status of the request too. You can repeat the locked request once the previous request has either completed successfully or failed; wait until the dependent request is finished.
    Just check this and repeat the request; it should be successful now.
    Regards
    Rohit

  • Mass data load into SAP R/3 - with XI?

    Hi guys!
    I have an issue: mass data migration into SAP R/3. Is XI a good solution? It will be about 60 GB of data. Or is there a better way to do this data load?
    Thanx a lot!
    Olian

    hi,
    SAP doesn't recommend using XI for mass data migration, and 60 GB is certainly too much.
    Use LSMW for that purpose.
    Regards,
    michal

  • Flat File Active Sync - Notify admin in case of data processing errors

    Dear Friends,
    We have a couple of requirements for the OOTB flat file active sync adapter:
    1. Read data from a flat file and update the records in the Sun Identity Manager system.
    2. Notify the admin if there are any data processing errors while reading data from the flat file. Data processing errors can occur if there is invalid data; for example, say the input flat file has 3 columns defined, but the file contains records with four values:
    firstname,lastname,email
    testfirst,testlast,[email protected],12345
    Req #1 is working fine; there are no issues with that.
    Req #2: if the file contains invalid data, I noticed that the active sync adapter throws an ArrayIndexOutOfBounds exception, so we need to send an email notification to the admin whenever a data processing error occurs.
    I noticed that whenever the data processing exception occurs, the active sync adapter stops processing records and the active sync input form is not triggered. Unless the active sync form is triggered, it's very difficult to determine whether the data was read successfully or not.
    Please let me know if there are any configurations/customizations to be made on the OOTB flat file active sync adapter to handle data processing errors and send email notifications to administrators.
    Appreciate your help
    Thanks
    Vijay

    Hi,
    We have the same requirement:
    "Notify the admin if there are any data processing errors from a flat file. The data processing errors can occur if there is invalid data, an account is locked, etc."
    In short: notify the admin if any error is logged in the active sync log file while active sync runs.
    Yes, I noticed the same: whenever the data processing exception occurs, the active sync adapter stops processing records and the active sync input form is not triggered. Unless the active sync form is triggered, it's very difficult to proceed to meet the requirement.
    Please let me know if there are any configurations/customizations to be made on the flat file active sync adapter to send email notifications to administrators.
    Thanks,
    Sudheer

  • Automatic import of mass data to the regional structure - Program RSADRLSM01

    Hello,
    Regarding the automatic import of mass data into the regional structure via program RSADRLSM02: we are working on replacing our third-party provider.
    That is why we need to delete all the data imported from the city file, and its references, before importing the new provider's data.
    We have checked the SAP procedure defined in SAP Note 132948 and the mentioned program RSADRLSM01, but we need to confirm whether the regional structure recorded in the old documents saved in the system could be impacted if program RSADRLSM01 is executed.
    Any experience with this kind of process?
    Thanks in advance.
    Juan Carlos

    Since no one has replied - why not just try this in your test system and see what happens?

  • Mass data update in Value mapping table

    Hi ,
      I have used value mapping replication to update mass data from an external source into the value mapping table. It is updating the runtime cache, but I want the data to be visible in the GUI value mapping table as well. Is that possible? I suspect the data in the runtime cache may be removed if the system restarts. Can anyone help?
    Thanks
    Laks

    Hi NALLAM GUNA RANJAN,
      Thanks for your prompt reply, but I didn't get what you are trying to convey. My issue here is:
    Instead of manually entering key-value pairs in the value mapping table, I used value mapping replication ( http://help.sap.com/saphelp_nw04/helpdata/en/2a/9d2891cc976549a9ad9f81e9b8db25/frameset.htm ).
    It updates the data in the runtime cache (you can see this using cache monitoring), but I am not able to view the data in the actual value mapping table (the GUI in the Integration Directory of SAP XI). I want the data updated via replication to be visible in the GUI table; is that possible?
    Hope you get the question much better now.
    Thanks
    Laks

  • System Master Data Process Chain

    Hello Guys,
    I am working on the Admin Cockpit. So far it is going well. However, I am struggling with scheduling the System Master Data process chain. When I look at its logs, I see that Operation Type (WHM) - Texts turns red, and the Attribute Change Run (the final step) has not finished yet (almost three hours). Please help.
    Thank you,
    OLGA

    Refer to this; a similar issue is discussed in the threads "Alpha Conversion" and "Non Alpha Compliant Value..hw can this be resolved on BI 7.0 SP 13".
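
    For reference, "non alpha compliant" values are characteristic values that were loaded without the ALPHA conversion exit (e.g. '123' instead of the zero-padded internal format). A minimal sketch (my illustration, not from the thread) of normalizing such a value, for instance in a transformation routine:
      DATA lv_value TYPE c LENGTH 18.

      lv_value = '123'.                   "not alpha compliant
      CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
        EXPORTING
          input  = lv_value
        IMPORTING
          output = lv_value.              "now '000000000000000123'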
