Use of logbook or PM in a high-volume context

Hello,
is "logbook" suitable for managing big volume of data (i.e. 3 to 10 millions of events like measurements per day) ?
An idea could be to create a log entry per event of this type and only log measurement(s) when relevant and necesary ?
Any insight on use of classical "measurement document" in PM (without logbook) is also welcome
Kind Regards
Eric

Hello Narasimhan,
Thanks for your feedback
We need to collect events related to fleet (truck) activity: kilometers, taxes to be collected, location, etc.
The information needs to be checked (truck number) and its status managed.
Notifications have to be sent according to rules.
A weekly/monthly summary is then produced and passed to several non-SAP CRM systems.
I see three possibilities:
- classical use of PM measurement documents (extended with additional information beyond the kilometer reading)
- use of the logbook
- use of IS-U counters
The context is high volume: 3 to 10 million events per day.
This information needs to be archived and retrieved when necessary.
It also needs to remain online for around three months.
Any suggestion is welcome.
Kind Regards
Eric
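
As an illustration of the "only log measurements when relevant" idea from the first post, below is a minimal Python sketch of a pre-filtering step in front of whichever backend is chosen (measurement documents, logbook, or IS-U counters). The post_measurement_document() function and the 50 km threshold are hypothetical placeholders, not an SAP API; the point is only that the 3 to 10 million raw events per day are staged and just the significant readings are posted.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class FleetEvent:
        truck_id: str          # checked against the truck / equipment master
        odometer_km: float     # cumulative kilometer reading
        recorded_at: datetime

    def post_measurement_document(event: FleetEvent) -> None:
        # Hypothetical placeholder: in a real system this would call whichever
        # backend is chosen (PM measurement document, logbook entry, IS-U counter).
        print(f"posting {event.odometer_km} km for truck {event.truck_id}")

    def filter_relevant(events, last_posted_km, min_delta_km=50.0):
        """Yield only events whose reading moved enough since the last posted one.

        last_posted_km: dict truck_id -> last odometer value already posted.
        min_delta_km: illustrative threshold; 'relevant' is really defined by the business rules.
        """
        for ev in sorted(events, key=lambda e: e.recorded_at):
            last = last_posted_km.get(ev.truck_id)
            if last is None or ev.odometer_km - last >= min_delta_km:
                last_posted_km[ev.truck_id] = ev.odometer_km
                yield ev

    if __name__ == "__main__":
        raw = [
            FleetEvent("TRUCK-001", 120_000.0, datetime(2024, 1, 1, 8, 0)),
            FleetEvent("TRUCK-001", 120_010.0, datetime(2024, 1, 1, 9, 0)),   # change too small, not posted
            FleetEvent("TRUCK-001", 120_075.0, datetime(2024, 1, 1, 12, 0)),  # posted
        ]
        posted = {}
        for ev in filter_relevant(raw, posted):
            post_measurement_document(ev)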

Similar Messages

  • Anyone have experience using ALE message FIDCC2 in a high volume scenario?

    Hi,
    We are evaluating the use of message type FIDCC2 to send complete FI documents from one SAP system (logistics) to another SAP system (finance). I'm particularly interested in throughput performance, as this will be a high-volume scenario (300K+ documents per month). Additionally, each SAP system has a customer include code block extension (CI_COBL) to BSEG that will be carried via an IDoc enhancement (custom segment), for which we are considering using XI to map the IDocs between the two systems. I would appreciate any info you can share regarding the use of FIDCC2 in a high-volume scenario. Thanks in advance.
    Regards,
    - Allen

    Hi Allen,
    In my past experience, I used the message variant (EDIDC-MESCOD) and message function (EDIDC-MESFCT) to run multiple inbound IDoc jobs in parallel for a given message type and IDoc type (by executing program RBDAPP01 on different application servers).
    I used the message variant and message function to group all sales order IDocs based on location and customer numbers.
    You need to enter the message variant and/or message function during partner profile setup (WE20). The middleware system (i.e. SAP XI) must also populate these fields accordingly for inbound IDoc processing.
    Hope this will help.
    Regards,
    Ferry Lianto
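
    Stripped of the SAP specifics, the pattern Ferry describes is "partition the inbound documents by a grouping key, then run one job per partition in parallel". A rough Python sketch of that pattern follows (it is not RBDAPP01 itself; the grouping key and worker count are illustrative assumptions):

        from collections import defaultdict
        from concurrent.futures import ProcessPoolExecutor

        def process_group(group_key, docs):
            # Stand-in for one inbound job working only on its own subset of documents.
            return f"{group_key}: processed {len(docs)} documents"

        def run_in_parallel(docs, key_fn, max_workers=4):
            # Group the documents by a key such as (location, customer number),
            # then hand each group to its own worker process.
            groups = defaultdict(list)
            for doc in docs:
                groups[key_fn(doc)].append(doc)
            with ProcessPoolExecutor(max_workers=max_workers) as pool:
                futures = [pool.submit(process_group, key, group) for key, group in groups.items()]
                return [f.result() for f in futures]

        if __name__ == "__main__":
            docs = [{"location": "PLANT1", "customer": i % 3, "id": i} for i in range(10)]
            print(run_in_parallel(docs, key_fn=lambda d: (d["location"], d["customer"])))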

  • What kind of throughput should I expect? Anyone using AQ in high volume?

    Hi,
    I am working with AQ in a 10.2 environment and have been doing some testing. What I have is a very simple queue with one queue table. The queue table structure is:
    id number
    message varchar(256)
    message_date date
    I have not done anything special with storage parameters, etc., so it's all default at this point. Then I created a stored procedure that generates messages given the message text and the number of times to loop. When I run this procedure with 10,000 iterations it runs in 15 seconds (if I commit all messages at the end) and 24 seconds if I commit after each message (probably more realistic).
    Now, on the same database I have a plain table that contains one column (message varchar(256)). I have also created a similar stored procedure to insert into it. For this, 10,000 inserts take about 1 second.
    As you can see, there is an order of magnitude of difference, so I am looking to see if others have been able to achieve higher throughput than 500-700 messages per second and, if so, what was done to achieve it.
    Thanks in advance,
    Bill

    Yes, I have seen it. My testing so far hasn't even gotten to the point of concurrent enqueue/dequeue. So far I have focused on enqueue time, and it is dramatically slower than a plain old database table. That link also discussed the multiple index-organized tables being created behind the scenes. I'm guessing that the 15x factor I am seeing is because of the 4 underlying tables, plus they are index-organized, which adds additional overhead.
    So my question remains: is anyone using AQ for high-volume processing? I suppose I could create a bunch of queues. However, that would create additional management on my side, which is what I was trying to avoid by using AQ in the first place.
    Can one queue be served by multiple queue tables? Can queue tables be partitioned? I would like to minimize the number of queues so that the dequeue processes don't have to contain multiplexed logic.
    Thanks
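
    For anyone wanting to reproduce the plain-table half of Bill's measurement and the commit-strategy effect he mentions, here is a rough Python harness. It assumes the python-oracledb driver and an existing test table MSG_TAB(message VARCHAR2(256)); the credentials and DSN are placeholders, and the AQ enqueue side would need a separate PL/SQL loop around DBMS_AQ.ENQUEUE to compare like for like.

        import time
        import oracledb

        def insert_rows(conn, n, commit_each=False):
            # Insert n rows, committing either per row or once at the end.
            cur = conn.cursor()
            start = time.perf_counter()
            for i in range(n):
                cur.execute("INSERT INTO msg_tab (message) VALUES (:1)", [f"message {i}"])
                if commit_each:
                    conn.commit()
            if not commit_each:
                conn.commit()
            return time.perf_counter() - start

        if __name__ == "__main__":
            conn = oracledb.connect(user="test_user", password="test_pwd", dsn="dbhost/orclpdb1")
            print("commit at end :", insert_rows(conn, 10_000), "s")
            print("commit per row:", insert_rows(conn, 10_000, commit_each=True), "s")
            conn.close()

    For the pure insert path, cursor.executemany() with array binds is normally much faster again than row-by-row execute(), which is worth keeping in mind when comparing against the AQ numbers.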

  • High volume Printing for GLM?

    I understand from various sources that GLM can support high-volume label output provided the correct enhancements and support packages are installed and configured. This may include updates to WWI. We are currently running ECC6 EHP5, SAP_BASIS 10, EHSM 3, WWI 2.7.
    I would like to get this community's input. Our requirement is to generate >2000 identical labels via GLM. WWI processing today is long (5+ min) and the output is large. Ideally, the system would generate one label with an indication to print it 2000 times. Our labels include barcodes and symbols and can be complex.

    Dear Richard,
    The functions of GLM are well explained, as Christopher said, and using GLM you can print more than 10,000 labels with a sequential data output. To support such a large quantity of data being printed on labels, you may need special printers such as the Zebra high-volume printer. The WWI server is equipped to print large volumes of labels using specific printer plug-ins.
    High-volume printers can be used for print requests with print files that are too large or that contain more than 32,768 labels to be printed. The HVP is designed as a printer driver for Microsoft Windows and is connected to the label printer via a plug-in.
    With the HVP, only one page with all static data is sent to the printer. The HVP then receives all of the sequential data via an interface and automatically supplements the sequential data in the printout. The HVP also integrates changing bar codes or texts into the printout.
    The Zebra 170Xi4 is one of the most popular industrial-strength printers on the market. This rugged metal unit prints 1-color labels up to 6.6" wide, with 300 dpi print resolution in thermal or thermal transfer mode. Its extra processing power equates to speeds up to 12 inches per second. It is well suited for tough applications including compliance labels, product labels, and shipping labels.
    For further information, please check the links below
    EHS - Continuous Improvement for Global Label Management - Logistics - SAP Library
    OSS notes in Global Label management
    New changes for GLM in EHP7.0 and ERP 6.0
    Dhinesh

  • Tool to export and import high volume data from/to Oracle and MS Excel

    We are using certain reports (developed in XLS and CSV) to extract more than 500K to 1M records in a single report. There are around 1,000 reports generated daily. The business users review those reports and apply certain rules to identify exceptions, then they apply those corrections back to the system through an Excel upload.
    The Excel reports are developed in TIBCO BW and deployed on the AMX platform. The user interface runs on TIBCO GI.
    Database version: Oracle 11.2.0.3.0 (RAC - 2 node)
    Input on the following points would be of great help:
    1) Recommendations for handling such high-volume reports and a mechanism to apply bulk corrections back to the system?
    2) Suggestions for any Oracle tool or third-party tool

    If you were to install the Oracle client software on the PC where Excel is installed,
    then you could use ODBC so that Excel can connect directly to the database and issue SQL.
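
    As an alternative sketch for the original export question, the extract can also be pulled in chunks and written to CSV (which Excel opens directly), so a 500K-1M row report never has to sit in memory at once. This assumes the python-oracledb driver; the query, table, and connection details are placeholders:

        import csv
        import oracledb

        CHUNK = 50_000  # rows fetched per round trip

        def export_to_csv(dsn, user, password, query, out_path):
            with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
                cur = conn.cursor()
                cur.arraysize = CHUNK          # batch size for fetches from the server
                cur.execute(query)
                headers = [col[0] for col in cur.description]
                with open(out_path, "w", newline="") as f:
                    writer = csv.writer(f)
                    writer.writerow(headers)
                    while True:
                        rows = cur.fetchmany(CHUNK)
                        if not rows:
                            break
                        writer.writerows(rows)

        if __name__ == "__main__":
            export_to_csv("dbhost/orclpdb1", "report_user", "report_pwd",
                          "SELECT * FROM daily_exceptions", "daily_exceptions.csv")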

  • When I have headphones plugged into my iPhone it shows a "high volume" warning and my volume is much lower than for others

    When I have headphones plugged into my iPhone, it shows a "high volume" warning and the volume indicator first turns red and then shows a number of red dots, but this does not happen for someone else I know. My sound is much lower than anyone else's when playing music, and the same happens when I talk on the phone with a headset.

    Hi! I have the same problem when I use my headphones.
    iPhone 4s England, iOS 6.1.2

  • Sender RFC adapter High volume messaging

    Hi,
    This question is related to this thread: RFC connection problem
    The ERP system is sending 20 requests per minute through 1 RFC destination (program ID), and PI starts to hang. ERP is not able to send the messages, and after a while the requests sent from ERP start to get cancelled. This is a synchronous scenario. How can I handle such a high volume through 1 sender RFC adapter?

    Hello
    You can monitor the load on the RFC adapter queues/threads in the RWB:
    -> Component Monitoring
    -> Adapter Engine XIP
    -> Engine Status
    -> Additional Data
    See note #791655, Documentation of the XI Messaging System Service Properties, for an explanation of the queues.
    To increase the number of threads/queues, see the blogs:
    1) /people/kenny.scott/blog/2007/08/20/messaging-system-queue-properties-after-xi-30-sp19-xi-70sp11
    2) /people/kenny.scott/blog/2008/12/05/xipi-file-and-jdbc-receiver-adapter-performance-and-availability-improvements - this shows how to prevent a problem on one RFC channel from blocking other RFC channels that you may be using.
    Also, ensure note #937159, XI Adapter Engine is stuck, has been applied to help overall system performance.
    Regards
    Mark

  • Question on QoS for a File -> SOAP synchronous high-volume interface

    Is it possible to have a File-to-SOAP (sender file adapter --> receiver SOAP adapter, synchronous - response needed by the sender) scenario with FIFO (first in, first out)? This is a high-volume interface (about 8 messages/second). If FIFO is possible, will a failed message block the succeeding messages?
    My understanding is that the QoS of the receiver SOAP communication channel is Best Effort.
    Thank you.

    Yes, EOIO will block the queue if a predecessor message is not in a final state.
    For your design you can refer to the wiki below; instead of RFC you can use the SOAP adapter:
    http://wiki.sdn.sap.com/wiki/display/XI/File-RFC-File%28Without%20BPM%29

  • How to prepare high-volume segmentation on CRM and TREX 7.1

    Hi,
    I have TREX 7.1 connected to our CRM system.
    I have set up ESH on the Web UI.
    Now I would like to use high-volume segmentation.
    Therefore I have created a data source in CRMD_MKTDS on both an "Attribute Set" and an InfoSet.
    Then I created an attribute list for high-volume segmentation.
    But I can't see the columns indicating that fast find is in use.
    Our CRM system is:
    SAP_ABA = SAPKA70106
    kr
    Michael Wolff
    Update:
    I solved this.
    I forgot to define the RFC destination for the TREX index and fast find under Marketing -> Segmentation.
    So now everything works.

    I did the missing customizing; thanks Willie for pointing me in the right direction.

  • HT6154 Hi, I am still under contract with my iPhone 5 and I can't hear anything even though I have my phone set to high volume

    I am using an iPhone 5 and am still under contract with AT&T. The phone suddenly has a technical issue. When I call someone or someone calls me, or if I turn on the music (anything that has to do with listening on the device), it does not work.
    I hear a very, very low voice even though it is set to high volume. Will AT&T provide me free support for the phone?

    Only ATT can answer that.

  • HT3887 How can I configure my wireless keyboard so I can use the Function (F) keys to control volume etc.? I am using OS 10.5.8

    How can I configure my wireless keyboard so I can use the Function <F> keys to control volume etc.? I am using OS 10.5.8.

    Little wireless cameras would not have the ability to send back to the base, even though the base can reach them.
    Now if you take regular cameras and wire them into a Cisco switch that is attached to a wireless bridge that can shine back to the base, then you might have something, but you're still going to have to do a survey/path analysis to determine whether it is feasible.
    Coverage isn't throughput. I see one base station with 3 or 4 sectoral antennas, depending on direction, pointing to several distribution bridges that bridge the gaps and go out again to the cameras; something like that. And all of it should be higher in the air than your construction.

  • High volumes on receiver JDBC adapter

    Hi,
    We have an RFC -> JDBC scenario where the RFC pulls huge amounts of data from R/3 and sends it to XI.
    XI needs to upload this data into 5 different DB tables. Each table contains 3000-8000 records, with each record containing 10-15 fields.
    When we try to run this scenario, due to the high volume of data the JDBC adapter hangs and messages stay in 'holding/delivering' status for a long time.
    Please advise on possibilities for handling this within XI.

    Hi,
    We changed the design and now we only have 'INSERT', so we no longer have concerns with the table refresh.
    I am splitting the records in the XI mapping into batches of 1000 each. But I found that one of the tables has more than 1 lakh (100,000) records.
    The data volume that we received in the RFC is 150,000 records (45 MB). It took 7.5 minutes to process this message in the Integration Engine.
    But the delivery of the messages into the DB tables (receiver JDBC adapter processing) is very slow. At maximum it can process 250 records per minute.
    Please provide your inputs on this design. Is it OK to accept a 45 MB message into XI in one shot? Even though the message got processed (split) in the IE, the messages are processing in the AE for a long time. I believe this will have an impact on other interfaces that use the JDBC adapter.
    Please provide your suggestions on how to improve the design/performance of this interface.
    Thanks!
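
    The "batches of 1000" idea on the database side is essentially batched inserts: one round trip and one commit per batch instead of per record. Below is a generic Python sketch of that pattern (python-oracledb assumed; it is not the XI JDBC adapter configuration itself, and the table and column names are placeholders):

        import oracledb

        BATCH_SIZE = 1000

        def chunked(records, size):
            # Split the full record list into batches of BATCH_SIZE.
            for i in range(0, len(records), size):
                yield records[i:i + size]

        def load(conn, records):
            # records: list of (f1, f2) tuples bound to the INSERT placeholders
            cur = conn.cursor()
            for batch in chunked(records, BATCH_SIZE):
                cur.executemany("INSERT INTO target_tab (f1, f2) VALUES (:1, :2)", batch)
                conn.commit()  # commit per batch keeps each transaction bounded

        if __name__ == "__main__":
            conn = oracledb.connect(user="xi_user", password="xi_pwd", dsn="dbhost/orclpdb1")
            load(conn, [(i, f"record {i}") for i in range(150_000)])
            conn.close()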

  • CCM 2.0 - Files Storage for high volume of files

    Hi all,
    I have seen in one message the following information:
    <i>1. Files storage.
    My point is that I think you've created a virtual folder on the SRM Server in SICF, which means that all your files are stored internally in the database, and not physically on the server.
    The most simple way is to create a physical folder on the server OS, and then create analias in SICF to this folder. You then will be abble to load in mass the pictures on the server using FTP, or network Share.</i>
    I want to upload images in the Catalog (CCM 2.0) and in my case I have high volume of files. Where is the best place to stored this data, in the database or in the server?
    And if it is in the server, how I can create an alias?
    Many thanks!!
    Regards

    Hi ,
    What we did for image upload in CCM 2.0 was like this:
    1. In SE80, go to the MIME Repository -> drill down to services -> bc/sap/bsp -> create a personal folder.
    2. Import your .jpeg image into this folder.
    3. Derive a URL with the structure: server name/domain name/services/file name.
    4. Test this URL in the IE browser; it should open the picture in IE for you.
    5. Then paste this URL into the 'image' characteristic of an item in the master catalog in CAT.
    With this, we could see the photos of the items in EBP.
    BR
    Dinesh
    Reward points if this helps.

  • SQL question in a high volume web app environment

    Hi,
    We have a high-volume website used in a hospital setting. When a doctor selects a patient, we have to delete all previous patient history records for that patient from a certain table and insert new records retrieved from a web service called at runtime during patient selection.
    Our 2 options are:
    Option1:
    1. Call the web service, and always insert the results into pt_history, even if this creates duplicates (which it normally shouldn't, if we only call every 24 hours)
    2. Only read pt_history records that are less than 24 hours old from the app method calls
    3. A nightly batch job cleans out records older than 24 hours
    Option2:
    1. Delete all existing history records for a patient each time before calling the web service
    2. Call the web service, and always insert the results into pt_history (no chance of duplicates here)
    3. No change to existing methods
    4. No nightly batch job
    I'm leaning toward option 1 to avoid contention/locks/deadlocks on the table. Any other solutions out there?
    Also, any insights into which may be the better option, this being a high-volume transaction (several tens of calls every second) with a table that could grow to be pretty large (2 million records and more)?
    Thanks,
    JGP

    1) Why do you need to "delete all previous patient history records"? That seems contrary to the point of having patient history records...
    We want to keep these records only for 24 hours. Anything older than 24 hours is not needed.
    2) If you are just using this table to cache results from the web service call, could you use a temporary table?
    We want to cache across all sessions for 24 hours, so a temporary table would not work.
    Thanks,
    JGP
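
    To make option 1 concrete, here is a minimal sketch of the three statements involved: always insert, read only rows newer than 24 hours, and purge older rows in the nightly job. The pt_history column names (patient_id, payload, created_at) and the Oracle-style named binds and INTERVAL syntax are assumptions for illustration:

        # Assumes an Oracle-style DB-API driver with named binds (e.g. python-oracledb)
        # and a pt_history table with the assumed columns.

        INSERT_HISTORY = """
            INSERT INTO pt_history (patient_id, payload, created_at)
            VALUES (:patient_id, :payload, CURRENT_TIMESTAMP)
        """

        SELECT_RECENT = """
            SELECT payload
              FROM pt_history
             WHERE patient_id = :patient_id
               AND created_at >= CURRENT_TIMESTAMP - INTERVAL '24' HOUR
        """

        PURGE_OLD = """
            DELETE FROM pt_history
             WHERE created_at < CURRENT_TIMESTAMP - INTERVAL '24' HOUR
        """

        def cache_results(conn, patient_id, rows):
            # Step 1: always insert the web-service results for this patient.
            cur = conn.cursor()
            cur.executemany(INSERT_HISTORY,
                            [{"patient_id": patient_id, "payload": r} for r in rows])
            conn.commit()

        def read_recent(conn, patient_id):
            # Step 2: application reads only records less than 24 hours old.
            cur = conn.cursor()
            cur.execute(SELECT_RECENT, {"patient_id": patient_id})
            return cur.fetchall()

        def nightly_purge(conn):
            # Step 3: scheduled once a day (cron / DBMS_SCHEDULER) to bound table growth.
            cur = conn.cursor()
            cur.execute(PURGE_OLD)
            conn.commit()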

  • How to add Marketing attributes in High Volume Mail Form

    Hi Experts,
    I have a requirement for my client.
    When I am executing a high-volume campaign, I am using a high-volume mail form.
    When I tried to include a marketing attribute in the mail form, it does not allow me to add marketing attributes in high-volume mail forms.
    Do we have any BAdIs or enhancements where we can hook in our own code so that the campaign picks up the marketing attributes we specify?
    Please reply.
    Thanks & Regards,
    Sandipan Jena

    Hi Willie,
    How do I know if the BAdI is getting triggered, and at what point?
    Is it triggered when we save the campaign or when we start the campaign?
    I am not able to debug the BAdI because my execution happens in the background. I tried to debug the job through JDBC but I am not able to reach the BAdI which I newly implemented.
    I also wrote an infinite loop inside the BAdI, but my execution didn't go into the infinite loop; rather, the campaign executed successfully.
    Can you help me?
    Thanks & Regards,
    Sandipan Jena
