Load from MQ Queue

Hi,
We have a requirement to load data from an MQ message queue.
What approach should we take in terms of the KM, ODI model, topology, etc., and how do we read the messages and load them into Oracle tables?
Cheers

Guys,
Any pointers here?
Cheers
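
For reference: in ODI the usual approach is to declare the queue under the JMS technology in Topology, reverse-engineer a model on top of it, and load the staging table with a JMS load knowledge module (for example, LKM JMS to SQL). Stripped of ODI, the read-and-insert flow is just JMS plus JDBC. A minimal Java sketch of that flow (the JNDI names, JDBC URL, credentials, and staging table are all assumptions, not values from this thread):

import javax.jms.*;
import javax.naming.InitialContext;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class MqQueueToOracle {
    public static void main(String[] args) throws Exception {
        // Look up the MQ connection factory and queue from JNDI
        // (the names here are placeholders).
        InitialContext jndi = new InitialContext();
        ConnectionFactory cf = (ConnectionFactory) jndi.lookup("jms/MQConnectionFactory");
        Queue queue = (Queue) jndi.lookup("jms/SourceQueue");

        Connection jms = cf.createConnection();
        Session session = jms.createSession(false, Session.AUTO_ACKNOWLEDGE);
        MessageConsumer consumer = session.createConsumer(queue);
        jms.start();

        try (java.sql.Connection db = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//dbhost:1521/ORCL", "stg_user", "stg_pwd");
             PreparedStatement ps = db.prepareStatement(
                 "INSERT INTO stg_mq_messages (payload) VALUES (?)")) {
            Message msg;
            // Drain the queue: stop when nothing arrives within 1 second.
            while ((msg = consumer.receive(1000)) != null) {
                if (msg instanceof TextMessage) {
                    ps.setString(1, ((TextMessage) msg).getText());
                    ps.executeUpdate();
                }
            }
        } finally {
            jms.close();
        }
    }
}

An ODI interface built on a JMS model automates essentially this read-and-stage loop.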

Similar Messages

  • Delta package not fetching all records from the delta queue in R/3

    Hello,
    I have loaded goods movement data using the 2LIS_03_BF DataSource into my BI system.
    The delta has been initialized, and every day the delta is moved from R/3.
    I observed that when I execute my delta package, not all delta records are fetched into the PSA from R/3.
    For example: before starting my delta package I check SMQ1 in my R/3 system and see that there are around 1000 records. On executing the delta package I see that the record count is reduced from 1000 to 400 in SMQ1. On executing the delta package again I get the remaining 400 records into my PSA.
    Shouldn't the delta package get all records from the queue in a single execution?
    Is there some setting that defines the number of records to be loaded?
    I'm confused by this behaviour. Please help me understand it.
    Thank You.

    Hello,
    First thing: the data is not transferred from the SMQ1 queue; the data is transferred to BW from the RSA7 delta queue. You need to check the number of records present in the RSA7 queue.
    Since SMQ1 is involved, I think you are using the unserialized V3 update method. In this method, when data is first written to the application tables, it is transferred first to the SMQ1 update queue, then via a job to the LBWQ extractor queue, and then to the RSA7 delta queue. So the number of entries that you see in the SMQ1 queue is not the number of entries that have to be transferred to BW, but rather the records that are waiting to be transferred to the extractor queue in LBWQ. Similarly, the number of entries displayed in LBWQ is not the number of entries that are going to be transferred to BW; they are the entries that will be transferred to the delta queue RSA7 when the next V3 update job runs.
    If you want to check the number of records that will be transferred to BW, select the DataSource in RSA7 and then click on the "display data entries" button.
    Hope this helps.
    Regards.

  • GetDescriptions() of an object loaded from local variable 'itemDeploymentResult'

    Hi
    We are upgrading our SAP MII (purely Java) systems from 7.31 SP5 to 7.40 and we are stuck at the Configuration phase. We are getting the following error during queue validation.
    Checking of deployment queue completed with error. java.lang.NullPointerException: while trying to invoke the method com.sap.sdt.j2ee.tools.deploymentmgr.DeploymentManagerItemResultIF.getDescriptions() of an object loaded from local
    variable 'itemDeploymentResult'.
    I looked in the trace files under the SUM/sdt/trc directory and found the following information.
    Jul 3, 2014 9:52:16 AM
    [Error]:
    com.sap.sdt.executor.module.ModuleExecutor [Thread[UC-3,5,main]]: A
    problem has occurred:
    com.sap.sdt.executor.exception.StepExecutionException: Problem while
    trying to execute operation on service validate-queue. The following
    problem has occurred java.lang.reflect.InvocationTargetException. CSN
    Component BC-UPG-TLS-TLJ.
    Checking of deployment queue completed with error.
    java.lang.NullPointerException: while trying to invoke the method
    com.sap.sdt.j2ee.tools.deploymentmgr.DeploymentManagerItemResultIF.getDescriptions() of an object loaded from local
    variable 'itemDeploymentResult'
    Jul 3, 2014 9:52:16 AM
    [Error]:
    com.sap.sdt.executor.module.ModuleExecutor [Thread[UC-3,5,main]]:
    Checking of deployment queue completed with error.
    Jul 3, 2014 9:52:16 AM
    [Error]:
    com.sap.sdt.executor.module.ModuleExecutor [Thread[UC-3,5,main]]:
    java.lang.NullPointerException: while trying to invoke the method
    com.sap.sdt.j2ee.tools.deploymentmgr.DeploymentManagerItemResultIF.getDescriptions() of an object loaded from local
    variable 'itemDeploymentResult'
    We have 26 GB left on our drive and have also upgraded JSPM to 7.31 SP8. We are using SUM10SP05_9.
    I'd appreciate your quick help.

    Hi Amarnath,
    Where exactly are you getting this error?
    If you are getting it at a JMS sender communication channel, try stopping and starting the JMS communication channel and check the status; also use XPI Inspector to get the exact error log.
    For reference, follow the blogs below:
    Michal's PI tips: ActiveMQ - JMS - topics with SAP PI 7.3
    Michal's PI tips: XPI inspector - help OSS and yourself
    XPI Inspector

  • MQ Adapter does not clear the rejected message from the queue

    Hi All,
    I'm using an MQ Adapter to fetch messages from the queue, without any backout queue configured. Whenever a badly structured message is found in the queue, the MQ Adapter rejects the message and moves it to the rejmsg folder, but does not clear it off the queue. As a result it keeps retrying the same message, filling the logs and the physical memory. We do not have any backout queue configured to which I could move the message. I have tried configuring the JCA retry properties and global JCA retry as well, but to no avail.
    - Isn't it the default behaviour of the MQ Adapter to remove a rejected message from the queue, irrespective of whether a backout queue is configured? The same behaviour works well with the JMS and File Adapters.
    - Is there any way I can make the MQ Adapter delete the message from the queue once it is rejected?
    Regards,
    Neeraj Sehgal

    Hi Jayson,
    Check this URL, which answers a problem with com.sap.engine.boot.loader.ResourceMultiParentClassLoader:
    http://209.85.175.132/search?q=cache:RnFZ9viwuKkJ:https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/pcd!3aportal_content!2fcom.sap.sdn.folder.sdn!2fcom.sap.sdn.folder.application!2fcom.sap.sdn.folder.roles!2fcom.sap.sdn.folder.navigationroles!2fcom.sap.sdn.role.anonymous!2fcom.sap.sdn.tln.workset.forums!2fforumtest!2fcom.sap.sdn.app.iview.forumthread%3FQuickLink%3Dthread%26tstart%3D45%26threadID%3D1020700+com.sap.engine.boot.loader.ResourceMultiParentClassLoader&hl=en&ct=clnk&cd=3&gl=in&client=firefox-a
    Please check that the JDK compliance level is at 5.0:
    Window->Preferences->Java->Compiler->Compiler compliance level: set this to 5.0.
    Set the installed JRE to the one you have mentioned, JDK 5.0 update 16:
    Window->Preferences->Java->Installed JREs
    Click on the Add button to select the path of your JDK.
    Once completed, click on the checkbox next to it.
    regards,
    AKD

  • Request data again from delta queue to PSA, but error

    Hi, expert
    I extracted 1 record from the delta queue to the PSA, and it was successful. I then set the request to 'Red' in the PSA monitor, deleted that request, and started the InfoPackage again;
    1 delta record was loaded, so that was also successful.
    But
    when I repeat the above operation (set the request to 'Red' and request the data again), the system raises the error below:
    Last delta update is not yet completed. Therefore,
    no new delta update is possible. You can start the request again if the last delta request is red or green in the monitor (QM activity).
    Why?
    Edited by: xuehui li on Apr 27, 2010 1:44 PM

    Hi,
    If your last delta was red, only then will the system ask you to repeat the same request; this is called a repeat delta.
    If you make your request green, you won't get the message, because the system then assumes that your last delta was successful.
    But if you don't want to lose your data in the delta process, you must do a repeat delta.
    If you have very little data in your ODS and you are not sure about what was updated,
    you may delete all the data from the ODS, delete the init request, and initialise the load again.
    Let me know if you have any doubts,
    Cheers,
    Sukhi

  • Footage changes once color project is loaded from disk

    Hi all,
    I have a big problem in Color 1.0.4.
    I sent a 20 minute edit from FCP to Color. All was fine. The edit is the first part of a marriage service, a shoot taken with two cameras.
    I originally put the sequence together using Multi-Cam, but then I replaced all the clips with the originals, as I had problems when I first went into Color with this.
    The problem is that one of the angles appears to have lost the information about where the IN/OUT points are. This happens once the Color project has been loaded from disk (all is fine until I shut Color and reload the project). The sections in the sequence where this camera is used always play the source file for that camera from the beginning (a shot from outside the church!). This is causing me all sorts of problems and I cannot seem to get past it. I need to save the work and then re-open in FCP to see what the grade looks like, as I don't have an external monitor.
    Any help with this would be appreciated.
    The thumbnails show the correct footage from the camera. Adding a section to the render queue results in a render of 1 frame. Even though the IN and OUT points on the Color timeline show roughly 20 seconds, the render view shows an in and out point one frame apart...
    Thanks in advance,
    Richard
    Message was edited by: richyN

    What is/are your sequence codec/settings?
    I know I get this sort of behavior on my external monitor if I've got a Kona export settings mismatch, but not on the timeline.
    When you observe the application running back to "first frame" on park, but finding the correct images on play, that sort of infers that there is a broken reference somewhere in how FCP is recalling the media.
    Bad container? Frame rate mismatch?
    But before you re-edit, can you save the effort to this point by "Baking out" a movie of the sequence?
    It would be at least interesting to know if QT finds the right media at your edit points. At that point, a fairly straightforward movie could be razored and sent to COLOR...
    jPo

  • How to get the XML messages from JMS Queue in BPM

    I have one requirement in my application. We are sending XML messages to a JMS queue. How do I get the XML messages from the JMS queue, and how do I extract the details from the XML?
    Can you please send me the code to get the XML messages from the JMS queue?
    Thank you,

    Hi,
    Sure others will have some other ideas, but here's what I typically do to get the XML from a JMS queue. Inside the Global Automatic that pops the messages off the queue you'd have logic similar to this:
    artifactInfoNodes as Any[]
    xmlObject as Fuego.Xml.XMLObject = XMLObject()
    load xmlObject using xmlText = message.textValue
    . . . Once you have this, it's a matter of deciding what you want to do with the message. Most times you'll parse the XML (using XPath statements), set argument variables and create a work item instance.
    Hope this helps,
    Dan
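
    For a plain Java client outside BPM, the same idea with the standard javax.jms and JAXP APIs might look like the sketch below (the JNDI names and the XPath expression are assumptions):

    import java.io.StringReader;
    import javax.jms.*;
    import javax.naming.InitialContext;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.xpath.XPathFactory;
    import org.w3c.dom.Document;
    import org.xml.sax.InputSource;

    public class JmsXmlReader {
        public static void main(String[] args) throws Exception {
            InitialContext jndi = new InitialContext();
            ConnectionFactory cf = (ConnectionFactory) jndi.lookup("jms/ConnectionFactory");
            Queue queue = (Queue) jndi.lookup("jms/OrderQueue");

            Connection con = cf.createConnection();
            Session session = con.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageConsumer consumer = session.createConsumer(queue);
            con.start();

            TextMessage msg = (TextMessage) consumer.receive(5000);  // wait up to 5 s
            if (msg != null) {
                // Parse the message body, then pull out a detail with XPath.
                Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                        .parse(new InputSource(new StringReader(msg.getText())));
                String name = XPathFactory.newInstance().newXPath()
                        .evaluate("/order/customer/name", doc);
                System.out.println("customer name = " + name);
            }
            con.close();
        }
    }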

  • Got JBO-25080: Definition name does not match the file it is loaded from

    We are using jdev 11.1.1.5.0
    We got the following error when running the ADF application on the Windows 2008 platform
    [2011-08-26T23:57:53.728-05:00] [AdminServer] [WARNING] [] [oracle.adf.controller.faces.lifecycle.Utils] [tid: [ACTIVE].ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: <anonymous>] [ecid: 87d74ec2641f19b2:21e0e8f6:131ffbcf000:-8000-0000000000000afc,0] [APP: grc863] ADF: Adding the following JSF error message: Definition name: Survey does not match the file in which it is loaded from: oracle.apps.grc.domain.survey.survey[[
    oracle.jbo.NoDefException: JBO-25080: Definition name: Survey does not match the file in which it is loaded from: oracle.apps.grc.domain.survey.survey
         at oracle.jbo.mom.DefinitionManager.loadLazyDefinitionObject(DefinitionManager.java:1127)
         at oracle.jbo.mom.DefinitionManager.loadParent(DefinitionManager.java:1353)
         at oracle.jbo.mom.DefinitionManager.loadLazyDefinitionObject(DefinitionManager.java:1061)
         at oracle.jbo.mom.DefinitionManager.findDefinitionObject(DefinitionManager.java:562)
         at oracle.jbo.mom.DefinitionManager.findDefinitionObject(DefinitionManager.java:484)
         at oracle.jbo.mom.DefinitionManager.findDefinitionObject(DefinitionManager.java:466)
         at oracle.adf.model.generic.AccessorDefImpl.getStructureDefImpl(AccessorDefImpl.java:217)
    However, we are able to run the application fine on the Linux platform.
    We are aware of the bug https://bug.oraclecorp.com/pls/bug/webbug_edit.edit_info_top?rptno=8746171
    We changed the UI classes to avoid that issue but we did not change our backend classes.
    Thanks

    Looks like you are an Oracle employee, as the link you provided can't be reached from the outside.
    If this is correct, you should use some kind of internal communication to address your problem, as we normal users don't have access to the internal stuff.
    Timo

  • Getting all values from a queue

    Hi all,
    I have a queue as follows:
    [123] - initial Context
    [sollinger Str]
    [123] - CC
    [Solinger Str]
    [123] - Final context
    An image of the queue looks as follows:
    [http://i42.tinypic.com/hsj052.jpg]
    I am getting this queue in a user-defined function. I want to get or copy all the values from this queue to a new array, as follows, without any contexts.
    That is, from the above queue I want a queue as follows:
    [123]
    [sollinger Str]
    [123]
    [Solinger Str]
    [123]
    How could that be done...
    Thanks
    P
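
    The standard node function removeContexts does exactly this. If a UDF is preferred, a minimal queue-type sketch could be (the function name is mine; ResultList, Container, and the CC/SUPPRESS constants are the standard SAP PI mapping API ones already used in this thread):

    public static void flattenQueue(String[] a, ResultList result, Container container) {
        // Copy every value to the output, skipping context-change and
        // SUPPRESS markers, so all values end up in a single context.
        for (String v : a) {
            if (!ResultList.CC.equals(v) && !ResultList.SUPPRESS.equals(v)) {
                result.addValue(v);
            }
        }
    }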

    Hi guys... thanks. One more thing:
    If my inbound queue "a" is as follows:
    SUPPRESS
    AP
    YG
    LF
    contextChange
    YG
    LF
    Final Context
    and my queue "b" is as follows:
    SUPPRESS
    123
    sollinger...
    123
    Sollinger...
    123
    FinalContext
    my requirement is to put a CC into the "b" queue as follows, using a standard node function if possible:
    SUPPRESS
    123
    sollinger...
    123
    contextChange
    Sollinger...
    123
    FinalContext
    If that doesn't work, I want to insert the CC into the "b" queue inside my UDF.
    How will the loop behave then?
    What will be passed to my input queues "a" and "b" when "all values of a context" is set?
    "a" has a context change and "b" doesn't have one.
    My UDF:
    public static void Test(String[] a, String[] b,
                            ResultList result, Container container) {
        // Queue "b" arrives without a context change, e.g.
        // {ResultList.SUPPRESS, "123", "sollinger...", "123", "Sollinger...", "123"}.
        // Copy it to a list, insert a context change before the second street,
        // and emit the values again.
        java.util.List<String> values =
            new java.util.ArrayList<String>(java.util.Arrays.asList(b));
        values.add(4, ResultList.CC);
        for (String v : values) {
            result.addValue(v);
        }
    }
    These are the steps: copy to an ArrayList, add a context change in between, and copy back out as queue "b".

  • Chinese characters scrambled when loading from DS to BW

    Hi, I've been pulling my hair out with this issue.
    I have a flat file containing Chinese text. When I load this in BW using 'FLATFILE' as a source system, it works fine. BW shows the correct Chinese characters.
    When I do the same load using BODI, I get funny characters.
    When I use BODI to load from one flat file into another flat file, the Chinese characters remain correct.
    What do I need to do to make sure I get the right Chinese characters in BW when loading from BODI?
    BODI is installed on Unix on Oracle 10.
    I run the jobs as batch processes.
    The dsconfig.txt contains:
    AL_Engine=<default>_<default>.<default>
    There are no locale settings in al_env.sh
    BW target is UTF-8 codepage.
    File codepage is BIG5-HKSCS
    BODI is set up as a Unicode system in SAP BW.
    When loading flat file to flat file, I get a message:
    DATAFLOW: The specified locale <eng_gb.iso-8859-1> has been coerced to <Unicode (UTF-16)
    because the datastore <TWIN_FF_CUSTOMER_LOCAL> obtains data in <BIG5-HKSCS> codepage.
    JOB: Initializing transcoder for datastore <TWIN_FF_CUSTOMER_LOCAL> to transcode between
    engine codepage<Unicode (UTF-16)>  and datastore codepage <BIG5-HKSCS>
    When loading to BW the messages are almost the same, but now the last step in UTF-16 to UTF-8.
    I read the wiki post, which definitely helped me understand the rationale behind code pages, but now I've run out of ideas about what else to check ( http://wiki.sdn.sap.com/wiki/display/BOBJ/Multiple+Codepages )
    Any help would be greatly appreciated.
    Jan.
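
    The coercion messages describe a two-step conversion: decode from the file codepage into Unicode, then encode into the target codepage. A minimal Java illustration of the same idea (the sample text is an assumption, and Big5-HKSCS support requires the JDK's extended charsets):

    import java.nio.charset.Charset;

    public class TranscodeDemo {
        public static void main(String[] args) {
            // Pretend these bytes came from the BIG5-HKSCS flat file.
            byte[] big5Bytes = "\u4e2d\u6587".getBytes(Charset.forName("Big5-HKSCS"));
            // Step 1: decode with the source codepage into Unicode
            // (Java strings are UTF-16 internally, like the engine codepage).
            String text = new String(big5Bytes, Charset.forName("Big5-HKSCS"));
            // Step 2: encode with the target codepage for BW.
            byte[] utf8Bytes = text.getBytes(Charset.forName("UTF-8"));
            System.out.println(text + " -> " + utf8Bytes.length + " UTF-8 bytes");
        }
    }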

    Hi all. Thanks for the inputs. This is what I got when I clicked on the Details tab of the monitor:
    Error when transferring data; communication error when analyzing
    Diagnosis
    Data packages or InfoPackages are missing in BI but there were no apparent processing errors in the source system. It is therefore probable that there was an error in the data transfer.
    The analysis tried to read the ALE outbox of the source system. This led to an error.
    It is possible that there is no connection to the source system.
    Procedure
    Check the TRFC overview in the source system.
    Check the connection to the source system for errors and check the authorizations and profiles of the remote user in both the BI and source systems.
    Check the ALE outbox of the source system for IDocs that have not been updated.

  • Full load from a DSO to a cube processes fewer records than are available in the DSO

    We have a scenario where, every Sunday, I have to make a full load from a DSO with on-hand stock information to a cube, where I register a counter at material and store level if stock is available.
    The DTP has no filters at all and has a semantic group on 0MATERIAL and 0PLANT.
    The key in the DSO is:
    0MATERIAL
    0PLANT
    0STOCKTYPE
    0STOR_LOC
    0BOM
    of which only 0MATERIAL, 0PLANT and 0STORE_LOC are later used in the transformation.
    As we had a growing number of records, we decided to delete in the START routine all records, where the inventory is not GT zero, thus eliminating zero and negative inventory records.
    Now comes the funny part of the story:
    Prior to these changes I would [in a test system, just copied from PROD] read some 33 million records and write out the same number of records. Of course, after the change we expected to write out fewer. To my total surprise, I was now reading 45 million records with the same unchanged DTP, and writing out the expected smaller number of records.
    When checking the number of records in the DSO I found the 45 million, but I cannot explain why the earlier loads retrieved only some 33 million from the same unchanged set of records.
    When checking in PROD: same result. We have some 45 million records in the DSO, but when we do the full load from the DSO to the cube, the DTP only processes some 33 million.
    What am I missing? Is there a compression going on? Why would the number of records in a DSO differ from the number of records processed in the data packages when I am making a FULL load without any filter restrictions and with only a semantic grouping in place for parts of the DSO key?
    Any idea or thought is appreciated.

    Thanks Gaurav.
    I did check whether there were any loads done in between; there were none in the test system. As I mentioned, it was a new copy from PROD to TEST; I compared the number of entries in the DSO and that seems to be a match between TEST and PROD (OK, some more in PROD, but they can be accounted for). In TEST I loaded the day before the changes were imported, to have a comparison, and between that load and the one after the changes were imported, nothing in the DSO was changed.
    Both DTPs in TEST and PW2 load from the activated DSO [without archive]. The DTPs were not changed in quite a while, so I ruled that one out. Same with activation of data in the DSO: this DSO gets loaded and activated in PROD daily via a process chain, and we load daily deltas into the cube in question. Only on Sundays, for the beginning of the new week/fiscal period, do we need to make a full load to capture all materials per site with inventory. The deltas loaded during the week are less than 1 million records, but the difference between the number of records in the DSO and the amount processed in the data packages is more than 10 million per full load, even in PROD.
    I really appreciate the knowledgeable answer; I just wish you had pointed out something that I missed.

  • Delta records are not loading from DSO to info cube

    My query is about delta loading from a DSO to an InfoCube (a filter is used in the selection).
    Delta records are not loading from the DSO to the InfoCube. I have tried all options available in the DTP, but no luck.
    I selected "Change log" and "Get one request only" and ran the DTP, but 0 records got updated in the InfoCube.
    I selected "Change log" and "Get all new data request by request", but again 0 records got updated.
    I selected "Change log" and "Only get the delta once"; in that case all delta records were loaded to the InfoCube as they were in the DSO, and it gave the error message "Lock Table Overflow".
    When I run a full load using the same filter, data loads from the DSO to the InfoCube.
    Can anyone please help me get the delta records from the DSO to the InfoCube?
    Thanks,
    Shamma

    Data is loading in the case of a full load with the same filter, so I don't think the filter is the issue.
    When I follow the sequence below, I get the lock table overflow error:
    1. Full load from the active table, with or without archive.
    2. Then, with the same setting, if I run an init, the final status remains yellow, and when I change the status to green manually, it gives the lock table overflow error.
    When I change the settings of the DTP to an init run:
    1. I select change log and get only one request, and run the init; it completes successfully with green status.
    2. But when I run the same DTP for delta records, it does not load any data.
    Please help me to resolve this issue.

  • Data load from DSO to cube

    Hi gurus
    We have a typical problem: we have to combine 2 records into one when they reach the DSO.
    One source is a flat file and the other is R/3, so I get a few fields from the flat file and a few from R/3. They create one record when they are loaded into the DSO. Now I get one record in the active data table, but when I load the data from that DSO to the cube (the data goes from the change log to the cube), I get 2 separate records in the cube (one loaded from the flat file and one loaded from R/3), which I don't want. I want only one record in the cube, just like I have in my active data table in the DSO.
    I can't get the data from the active data table because I need a delta load.
    Would you please advise what I can do to get that one record in the cube?
    Please help

    Ravi
    I am sending the data through DTP only, but is there any solution to get one record? Because in another scenario I get data from 2 different ERP sources and get one record in the DSO and the cube as well.
    But that is not happening for this second scenario, where I get data from the flat file and ERP and try to create one record.

  • Data load from DSO to cube fails

    Hi Gurus,
    The data loads failed last night, and when I dig inside the process chains I find that the cube which should be loaded from the DSO is not getting loaded.
    The DSO has been loaded without errors from ECC.
    The error message says: "The unit/currency 'source currency 0currency' with the value 'space' is assigned to the key figure 'source key figure.ZARAMT'".
    I looked in the PSA; it has about 50,000 records,
    and all data packages have a green light and all amounts have 0currency assigned.
    I went into the DTP and looked at the error stack; it had nothing in it. Then I changed the error handling option from 'no update, no reporting' to 'valid records update, no reporting (request red)' and executed; the error stack then showed 101 records.
    The ZARAMT field has 0currency blank for all these records.
    I tried to assign USD to them; the changes were saved and I tried to execute again, but then the message said that the request ID should be repaired or deleted before the execution. I tried to repair it; it says it cannot be repaired, so I deleted it and executed. It fails, and the error stack still shows 101 records. When I look at the records, the changes I made do not exist anymore.
    If I delete the request ID before changing and try to save the changes, then they don't get saved.
    What should I do to resolve the issue?
    thanks
    Prasad

    Hi Prasad,
    The error stack is request specific. Once you delete the request from the target, the data in the error stack will also get deleted.
    Actually, in this case what you are supposed to do is:
    1) Change the error handling option to 'valid records update, no reporting (request red)' (as you have already done) and execute the DTP; all the erroneous records will get accumulated in the error stack.
    2) Then correct the erroneous records in the error stack.
    3) Then in the DTP, in the "Update" tab, you will get the option "Error DTP". If it is not already created, you will get the option "Create Error DTP"; click there and execute the error DTP. The error DTP will fetch the records from the error stack and will create a new request in the target.
    4) Then manually change the status of the original request to green.
    But did you check why the value of this field is blank? If these records come again as part of a delta or full load, your load will fail again. Check in the source system and fix it, for a permanent solution.
    Regards,
    Debjani

  • Data loading from DSO to Cube

    Hi,
    I have a question,
    In the book TBW10 I read about the data load from a DSO to an InfoCube:
    "We feed the change log data to the InfoCube; 10, -10, and 30 add up to the correct value of 30."
    My question is: the cube already has the value 10, so if we are sending 10, -10 and 30 as delta values, shouldn't the total be 40 instead of 30?
    Please can someone explain this to me?
    Thanks

    No, it will not be 40.
    It will be 30 only.
    Since the cube already has 10, the before image will nullify it by sending -10, and then the correct value in the after image will be added as 30.
    So it will be like this: 10 - 10 + 30 = 30.
    Thank you.
    Regards,
    Vinod
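
    The same arithmetic as a tiny runnable sketch (the values are the ones from the thread: +10 is the after image of the first load; -10 and +30 are the before and after images of the change):

    public class ChangeLogDemo {
        public static void main(String[] args) {
            int[] changeLog = {10, -10, 30};
            int cubeValue = 0;
            for (int delta : changeLog) {
                cubeValue += delta;  // the -10 before image cancels the 10 already loaded
            }
            System.out.println(cubeValue);  // prints 30, not 40
        }
    }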
