Historical data from ACE20 load balancer modules

Hello,
I am trying to get some historical data from our ACE20 load balancer modules, which are housed in a 6504, but I cannot seem to find where to get this information. I would like to get information such as memory usage, CPU, active connections, throughput, HTTP requests, etc. over the past month. Is there any way that I can do this? I am using ANM 2.0.
Thanks

Hello,
Below are the steps to load from DB Connect.
Go to RSA1 -> Modeling -> Source System and double-click the source system. On the right-hand side, right-click the topmost node; this takes you to a screen where you can enter the table name (if you know it) or just execute to display all the tables you have in Oracle. There, double-click the selected table, select the fields required, and generate the DataSource. Then go back to RSA1 -> Source System and replicate the DataSource. Assign the InfoSource as usual, create an InfoPackage, and start the load.
Note that this does not support the delta process.
This document could help:
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/2f0fea94-0501-0010-829c-d6b5c2ae5e40
Regards,
Dhanya

Similar Messages

  • HOW TO TRANSFER HISTORICAL DATA FROM ONE ACCOUNT TO ANOTHER

    Product: FIN_GL
    Date written: 2006-05-29
    HOW TO TRANSFER HISTORICAL DATA FROM ONE ACCOUNT TO ANOTHER
    =============================================================
    PURPOSE
    This note explains how to transfer balances for a specific period from one account to another.
    Explanation
    Using the Mass Maintenance feature in GL, you can move balances from one account to another, or from multiple accounts into a single account.
    1. In the GL Responsibility, choose Other > Mass Maintenance.
    2. Enter a Request Name and Description for the Move/Merge operation.
    3. Select Move or Merge as the Request Type.
    4. Enter a line number for the source-to-target account pair.
    5. Select the source account from the LOV.
    All accounts must be enabled.
    6. Select the target account from the LOV as well.
    7. You can optionally run a validation before executing the requested operation.
    8. Save your work.
    9. Run the Move/Merge request.
    Example
    N/A
    Reference Documents
    Note. 146050.1 - How to Transfer Historical Data from One Account to Another

    Follow the directions here:
    http://support.apple.com/kb/HT2109

  • Exporting Historical Data from Lookout Version 3.8 to Excel

    Can anyone please give step-by-step instructions on how to export historical data from Lookout Version 3.8 to an Excel Spreadsheet or CSV file?

    Hi koehler, in the following link you can find detailed step-by-step instructions:
    http://digital.ni.com/public.nsf/websearch/5915049F3BF93935862568C5005E9DF3?OpenDocument
    Benjamin C
    Senior Systems Engineer // CLA // CLED // CTD

  • Copy historical data from one material to another

    I have added a new material in the planning hierarchy and need to copy historical data from an existing material to this new material.
    For achieving the same I carried out below steps:
    In the forecasting view of the new material I added the existing material as the reference material for consumption, and also added the reference plant, validity period and multiplier.
    But even then the system is not copying the historical data of that material to the new material. I cannot see any data in MC94 or in the info structure.
    Please let me know where I am going wrong or if I am missing any setting.
    Thanks in advance.
    Regards,
    Sonal

    Dear,
    The system will not copy the reference material's consumption into the new material's Total Consumption tab. If you maintain these fields:
    1. Reference material: consumption
    2. Reference plant: consumption
    3. Date to
    4. Multiplier
    5. Period indicator (M/W): note that if you maintain the first four options but not the same period indicator, the system will not consider the consumption.
    Based on this setup, when you execute the forecast (individually via the Forecasting tab in MM02, collectively via MP38, or through SOP), the forecast program will use the historical values of the reference material and project the new material's forecasted demand over your forecast periods (2 months, 3 months, etc.) in the forecasting view.
    The system will not copy the historical values to the Total Consumption tab of the new material.
    If you need to copy the consumption of the old material to the new material, you have to use report RMDATIND, MVER_DI or BAPI_MATERIAL_MAINTAINDATA_RT. For more details please refer to OSS Note 200547 - Program: Direct input for consumption values.
    Moreover, why do you need to copy the data when you have the facility to use a reference in the forecasting view?
    Just check the points I have mentioned above and execute the forecast. Do not forget to keep the same period indicator (M/W) in both materials.
    Test a sample and proceed.
    Hope it is clear.
    Regards
    JH
    Edited by: Jiaul Haque on Jun 5, 2010 10:12 AM

  • Cant extract historical data from Citadel database

    When using the simplest VIs to extract historical data from the Citadel database, I get error messages:
    using "Get Historical Tag List.vi"
    CIT_OpenDatabase.vi
    error code - 0x8abc0010
    Using "Read Historical Trends.vi"
    CIT_ReadTrace.vi
    error code - 0x8abc0010
    I am using dsc version 6.0, and have already tried to upgrade to ver. 6.0.2. This did not go well, and I had to return to ver. 6.0 (reinstalling NT and everything else) to make my system run again.

    Download and install the latest release of Logos from ftp://ftp.ni.com/lookout/logos. Logos provides the backbone drivers for the Citadel database. The most recent releases of Logos address issues similar to this.

  • Prevent data from being loaded into certain dimensions

    Is there any way to prevent data from being loaded into certain dimensions? For example, if I have 2 dimensions: Scenario and Date, I only want a given Scenario to accept data in certain date ranges.
    Thanks,
    Jason

    I'll assume you mean loading via load rules. (If you are talking about sending data from Excel, then only the first option applies.)
    There are two ways I can think of doing this:
    1. Create filters with write access to certain intersections and load the data with an ID associated with that filter.
    2. Use selection/rejection criteria in the load rule to limit what can be loaded.
    As a bonus:
    3. If you are using SQL interface loads, set up a table defining what data is allowed to be loaded.

  • How to extract the historical data from R/3

    Hi,
    I am extracting data from R/3 through LO extraction. The client asked me to enhance the data source by adding a field. I have enhanced the data source and written an exit to populate the data for that field.
    How do I extract the historical data into BI for the enhanced field? A delta load is already running in BI.
    Regards

    Hi Satish,
    Since you have enhanced the data source, the standard SAP approach is to delete the data from the cube and reload it from the setup tables.
    Simply continuing with the normal delta load is not enough, because you won't get any historical data for the new field that way.
    The best approach is to take downtime from the users; normally this is done on weekends or during non-business hours.
    Then fill the setup tables. If the data volume is huge, you can parallelize the fill, for example:
    1. Fill the setup tables year by year as background jobs.
    2. Fill the setup tables year by year, restricted by posting periods (Jan 1 to Dec 31 of each year), as background jobs.
    This makes the setup-table fill easier and faster. Once the setup tables are filled, you can unlock all users, as postings are no longer a concern.
    Then load all the data into BI, first into the PSA and then into the cube.
    Regards,
    Ravi Kanth.

  • How can I support a health check, from a load balancer?

    My company has load balancers which use health checks to determine if the end point is available for client traffic. The basic health check is a TCP ping, which will tell you if the device is on the network. The next level of health check is an HTTP request. This request and its response are static; you can't create your own version of either. The standard request is this:
         http://host:port/healthcheck/hc.html
    The standard response is this:
         “The server is available”
    I want to use the load balancer as part of my total deployment. The problem is that I am not seeing how to support this health check request and response in the MDEX engine. What I see is that this request
         http://host:port/admin?op=ping
    will return this response
         dgraph <host>: <port> responding at <day month year time>
    It is nice that there is a built-in ping, but I am not able to make use of it. I am new to Endeca and still poking around. The dgraph process listens on a port set up in <…>/config/script/AppContext.xml:
    <dgraph id="Dgraph1" host-id="MDEXHost" port="3281">
    <properties>
    <property name="restartGroup" value="A" />
    <property name="updateGroup" value="a" />
    </properties>
    <log-dir>./logs/dgraphs/Dgraph1</log-dir>
    <input-dir>./data/dgraphs/Dgraph1/dgraph_input</input-dir>
    <update-dir>./data/dgraphs/Dgraph1/dgraph_input/updates</update-dir>
    </dgraph>
    (I am not using the default port, as I only have an instance on a shared server and have to worry about port clashing. But that is a different thread.)
    In a standard tc Server install I can support this health check by doing this:
    * Create a directory named “healthcheck”, in the “webapps” directory.
    * Place a file name “hc.html” in that directory, which contains “The server is available”
    The one hack that comes to mind is to write a servlet that acts as a smart proxy for the load balancer health check (see the sketch after this post). It would pass regular traffic along to the MDEX engine, but if the request was a health check it would send "admin?op=ping" to the MDEX engine and, on a good response from the engine, create and pass back the correct response to the load balancer.
    Ideas, comments, flames, …
    Thanks
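
    A rough sketch of the health-check half of that servlet idea is below, in case it is useful. It only covers the health check itself (regular search traffic would still go straight to the dgraph port); the servlet mapping, MDEX host/port and timeout values are placeholder assumptions, and the "responding at" check is based on the ping response format quoted above.
    // Hypothetical health-check servlet: map it to /healthcheck/hc.html in web.xml
    // so the load balancer's standard request URL keeps working unchanged.
    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    public class MdexHealthCheckServlet extends HttpServlet {
        // Assumed dgraph host/port; in practice read this from an init-param or config file.
        private static final String MDEX_PING_URL = "http://localhost:3281/admin?op=ping";
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            if (mdexIsUp()) {
                resp.setStatus(HttpServletResponse.SC_OK);
                resp.setContentType("text/html");
                PrintWriter out = resp.getWriter();
                out.println("The server is available");  // the exact string the load balancer expects
            } else {
                // Any non-200 status marks this node down on the load balancer.
                resp.sendError(HttpServletResponse.SC_SERVICE_UNAVAILABLE);
            }
        }
        // True if the dgraph answers admin?op=ping with HTTP 200 and a "responding at" body.
        private boolean mdexIsUp() {
            try {
                HttpURLConnection conn = (HttpURLConnection) new URL(MDEX_PING_URL).openConnection();
                conn.setConnectTimeout(2000);
                conn.setReadTimeout(2000);
                if (conn.getResponseCode() != HttpURLConnection.HTTP_OK) {
                    return false;
                }
                try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
                    String line = in.readLine();
                    return line != null && line.contains("responding at");
                }
            } catch (IOException e) {
                return false;
            }
        }
    }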

    Hi, we are using the following string to test the MDEX ping response, but we get an "Invalid version format" warning in dgraph.log.
    The following is configured on the F5:
    GET /admin?op=ping HTTP/1.1/r/nHost:myhost.endeca.com:19000/r/nConnection:close/r/n/r/n
    The following gets logged in dgraph.log:
    WARN 09/05/12 05:30:03.799 UTC (1346823003799) DGRAPH {dgraph} Invalid version format in 'HTTP/1.1/r/nHost:myhost.endeca.com:19000/r/nConnection:close/r/n/r/n'
    Please let me know if you have any suggestions to solve this issue.
    I know that it works from a browser and with wget from the Unix command line:
    wget http://myhost.endeca.com:19000/admin?op=ping
    from browser:
    http://myhost.endeca.com:19000/admin?op=ping
    Thanks,
    Ram
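
    One thing that stands out in that log line: the monitor's send string contains literal "/r/n" (forward slashes) rather than "\r\n" CRLF escapes, so the dgraph sees the whole thing as one long request line and rejects the HTTP version. Assuming a standard F5 HTTP monitor send string, something along these lines should parse cleanly (host name and port taken from the example above):
         GET /admin?op=ping HTTP/1.1\r\nHost: myhost.endeca.com:19000\r\nConnection: close\r\n\r\n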

  • Cisco ACE20 Load balancing issues

    Dear All,
    I have a problem with the ACE20 load balancer.
    To start with, the following is our architectural request flow:
    Load Balancer --> Webseal /(reverse proxy) --> HTTP Server --> Portal Server
    We have Hardware Load Balancer Cisco ACE20.
    When we access our portal via the WebSEAL server directly it works fine without any issue, but when we access the same application through the ACE we face the following issues:
    1) Some of the links do not work. For example, we have a link "subscribe" which points to https://intranet/abc/wps/portal/subscription; whenever we click on this link, the request is directed to https://intranet/abc/wps/portal, i.e. the homepage.
    2) URL redirection does not work. We have some links which use URL forwarding or redirection. For example, when we open https://intranet/ef/quickplace it forwards the request to https://intranet/ef/quickplace/Main.nsf?opendocument....., but the redirection fails and again the request is thrown to the homepage, i.e. https://intranet/abc/wps/portal.
    3) The overall response of the portal when accessed via the ACE is very sluggish: the homepage takes 20 seconds to load, whereas it loads in 4 seconds when accessed via WebSEAL.
    Below are the ACE details. Kindly provide your inputs to resolve this issue; I will rate all suggestions.
    Hardware Product Number: ACE20-MOD-K9
      Card Index:     207
      Hardware Rev:   2.3
      Feature Bits:   0000 0002
      Slot No. :      7
      Type:           ACE
    Software
      loader:    Version 12.2[120]
      system:    Version A2(1.4) [build 3.0(0)A2(1.4) adbuild_11:54:12-2009/03/05_/a
    uto/adbu-rel2/rel_a2_1_4_throttle/REL_3_0_0_A2_1_4]
      system image file: [LCP] disk0:c6ace-t1k9-mz.A2_1_4.bin
      installed license: ACE-SEC-LIC-K9

    Dear all,
    Please suggest on this issue.
    BS

  • Site not accessible from the Load balanced web front end server - sharepoint 2010

    I have a production environment with 2 WFEs (sp-wfe1 & sp-wfe2), 2 APP servers and 2 clustered SQL VMs.
    The 2 WFEs are load balanced using a hardware load balancer.
    An A record (PORTAL) is created in DNS for the virtual IP of the load balancer, which points to the 2 WFEs.
    A web application is created on the WFEs on port 80.
    Alternate access mapping is configured and the load-balanced record "http://PORTAL" is used under the default zone.
    Under IIS I have edited the bindings for the SharePoint site on port 80 and added the host name PORTAL.
    Result: The site is accessible from outside the servers and works fine.
    ISSUE: The site is not accessible from within the WFEs (sp-wfe1 & sp-wfe2) themselves.
    When I browse the site from the WFE servers it asks for credentials, and when I enter the credentials and click OK it asks for them again and again, and in the end displays a blank page.
    Kindly help me in this issue because I am clueless and couldn't find anything helpful on the internet. 
    Regards,
    Mudassar
    MADDY-DEV

    Loopback check:
    http://www.harbar.net/archive/2009/07/02/disableloopbackcheck-amp-sharepoint-what-every-admin-and-developer-should-know.aspx
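
    If I remember that article correctly, the fix comes down to one of two registry changes on each WFE (a sketch from memory; please verify against the article before applying, since disabling the loopback check entirely has security implications):
         HKLM\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0
             BackConnectionHostNames (Multi-String) = PORTAL   (preferred: allow only the load-balanced host name)
         HKLM\SYSTEM\CurrentControlSet\Control\Lsa
             DisableLoopbackCheck (DWORD) = 1   (blunt: disables the loopback check completely)
    Restart IIS (or reboot) on the WFEs afterwards, then test http://PORTAL from sp-wfe1 and sp-wfe2 again.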

  • Upload of historical data from legacy system

    Dear forum
    We are running standard SOP with transaction MC88 in ECC6.0.
    We are doing forecasting on material/plant level, based on historical data (we create the sales plan from transaction MC88).
    Now, the problem is that we are introducing new products into SOP, but there is no historical data in SAP for those materials.
    Is there any way we can import historical data into SAP from legacy system, so that SOP would take this data into account when calculating the forecast?
    Note: we do not want to build anything in flexible planning. Just want to check whether or not it is possible to import historical values from a legacy system, to use as historical data in SAP.
    Thanks in advance
    Lars

    Thanks,
    But that will not help us.
    Any other suggestions out there?

  • Unable to read data from Analog Devices 6b11 module - error code 1240

    Hi everyone,
    I'm trying to read data from a thermocouple with an AD 6B11 module in an AD 6BP16-1 backplane using RS232 serial. I've been following this guide:
    http://digital.ni.com/public.nsf/allkb/8C77E5E52B4A27968625611600559421
    Everything seems to work out well until step 13, where I call the "AD6B Input Module - Read Data.VI". I get an error code 1240 stating something like "invalid parameter or unable to read instrument".
    Has anyone got any experience with this error or suggestions to fixing it?
    Thanks in advance!

    That user guide is 20+ years old. The serial functions from that era used 0 for COM1, 1 for COM2, etc. Make sure you have the correct one selected. Please provide the exact error message instead of 'something like'.

  • Extracting data from Essbase & loading into flat file through ODI

    Hi,
    I want to extract data from Essbase and load it into a flat file through ODI (for the extraction from Essbase I'm using a report script). I'm using these KMs: LKM Hyperion Essbase data to SQL, IKM SQL to FILE Append, and RKM Hyperion Essbase for reversing. All the mappings have been done and the interface has been built, but when I execute the interface it throws the error below:
    ODI-1217: Session ESS_FILEI (114001) fails with return code 7000.
    ODI-1226: Step ESS_FILEI fails after 1 attempt(s).
    ODI-1240: Flow ESS_FILEI fails while performing a Loading operation. This flow loads target table ESS_FILE.
    ODI-1228: Task SrcSet0 (Loading) fails on the target FILE connection FILE_PS_ODI.
    Caused By: java.sql.SQLException: ODI-40417: An IOException was caught while creating the file saying The system cannot find the path specified
    at com.sunopsis.jdbc.driver.file.impl.commands.CommandCreateTable.execute(CommandCreateTable.java:62)
    at com.sunopsis.jdbc.driver.file.CommandExecutor.executeCommand(CommandExecutor.java:33)
    at com.sunopsis.jdbc.driver.file.FilePreparedStatement.execute(FilePreparedStatement.java:178)
    at oracle.odi.runtime.agent.execution.sql.SQLCommand.execute(SQLCommand.java:163)
    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)
    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
    at java.lang.Thread.run(Thread.java:619)
    Please let me know what I'm missing and how I can resolve this error.
    Thanks

    It seems that you are trying to use the file as your staging area. The Hyperion LKM extracts Essbase data into a DB staging area, which your file IKM can then use to load it into the file.
    You need to use an RDBMS for your staging area.

  • Clearing all historical data from the graphs

    Hi all,
    Now that I have upgraded to 4.1.1c I want to clear all historical data and start from scratch with the graphs/stats/reporting. Is there a way to do such a thing? I could not find anything in the docs other than "clear statistics all", which seems unrelated.
    Thanks
    Adrian

    http://www.cisco.com/en/US/docs/app_ntwk_services/waas/waas/v411/configuration/guide/monitor.html#wp1043484
    Enclosed is the report monitoring and configuration guide.
    The clear statistics command clears all statistical counters from the parameters given. Use this command to monitor fresh statistical data for some or all features without losing cached objects or configurations.

  • Retrieving historical data from new ST04

    In the old ST04, you could get a nice, 3 month daily overview of key measures just by hitting "Previous Days".  I use that in my performance analyses.  With the new ST04, I have no idea how that's done.  From my understanding, the new ST04 should give you historical data if you give an initial snapshot date that's far enough back.  But I see no way to define the initial date as anything but "Database Start".  SAP_COLLECTOR_FOR_PERFMONITOR has been running consistently for months. Programs RSORAHCL and RSORAVSH are scheduled to run hourly every day.  So the history should still be there, and should be accessible.  However, the documentation on the new DBACOCKPIT is very sketchy, as far as I've seen.
    Can anyone either point me to some good documentation on this topic, or provide some hints.  I'd very much appreciate it.
    Thanks very much.
                                                       Gordon

    Stefan,
    The system I'm looking at, L6P, is on Basis 7.00 SP 13. For troubleshooting, I compared L6P to our G8P system, which is on Basis 7.00 SP 15.
    One thing I found is that I can select some dates under the "Database Start" and "Up To Now" buttons on G8P, but not on L6P.  I further found that in table TCOLL, RSORAHCL has all 7 days marked in G8P, but no days marked in L6P.  That would explain why I see the dates in G8P, but not in L6P -- I need to flag the days for RSORAHCL in TCOLL.
    So now, I know what I need to do to pick dates going back as far as what's in AWR, according to dba_hist_snapshot.  I also see how I can change the snapshot interval and retention periods:
    begin
       dbms_workload_repository.modify_snapshot_settings (
          interval  => 20,
          retention => 2*24*60
       );
    end;
    for example. The values are in minutes, so I'd want interval = 1440 and retention = 90*24*60 = 129600 for 3 months of daily snapshot data.
    However, I still don't see how I can actually see the history. I go into Statistical Information --> System Summary Metrics, select Metrics Datasource dba-view, and put in the dates I want. I get a lot of metrics, but I don't see a way to limit them to the specific ones I want (for the data buffer hit rate, I think I'll need to get the physical and logical reads). How can I clean this up to show only what I need?
    Thanks very much.
                                                                    Gordon
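
    For the physical and logical reads, one way to pull three months of per-snapshot numbers straight out of AWR is to diff the cumulative values in DBA_HIST_SYSSTAT between snapshots. Below is a rough standalone sketch over JDBC; the connection URL and credentials are placeholders, you need the Oracle JDBC driver on the classpath and SELECT access to the DBA_HIST views, and querying AWR assumes the Diagnostics Pack is licensed.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    public class AwrReadStats {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details; adjust to your environment (e.g. the L6P service).
            String url = "jdbc:oracle:thin:@//dbhost:1521/L6P";
            // Delta of the cumulative statistic between consecutive AWR snapshots.
            // A negative delta indicates an instance restart between snapshots.
            String sql =
                "SELECT sn.snap_id, " +
                "       TO_CHAR(sn.end_interval_time, 'YYYY-MM-DD HH24:MI') AS snap_end, " +
                "       st.stat_name, " +
                "       st.value - LAG(st.value) OVER " +
                "         (PARTITION BY st.stat_name, st.instance_number ORDER BY sn.snap_id) AS delta " +
                "FROM   dba_hist_snapshot sn " +
                "JOIN   dba_hist_sysstat  st " +
                "  ON   st.dbid = sn.dbid AND st.snap_id = sn.snap_id " +
                " AND   st.instance_number = sn.instance_number " +
                "WHERE  st.stat_name IN ('physical reads', 'session logical reads') " +
                "AND    sn.end_interval_time > SYSTIMESTAMP - INTERVAL '90' DAY " +
                "ORDER  BY st.stat_name, sn.snap_id";
            try (Connection con = DriverManager.getConnection(url, "perfuser", "secret");
                 PreparedStatement ps = con.prepareStatement(sql);
                 ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    long delta = rs.getLong("delta");  // 0 on the first snapshot of each statistic
                    System.out.printf("%s  %-22s %,15d%n",
                            rs.getString("snap_end"), rs.getString("stat_name"), delta);
                }
            }
        }
    }
    With the snapshot interval set to 1440 as above, each delta is then one day's worth of reads, which is what the buffer hit rate calculation needs.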
