Export Data for Prime Infrastructure

I started an Export for Prime Infrastructure on my LMS 4.2.3 box several days ago and it is still running. There are approximately 3600 devices being exported from LMS, so I am guessing it should not take this long. I would like to kill the process, but I am not sure which one to stop.

That would be LMS he'd need to restart, Afroz.
Depending on the platform:
net stop crmdmgtd, then net start crmdmgtd (Windows server)
application stop LMS / application start LMS (soft or physical appliance)
There is probably an individual process you could restart, but I don't have it at my fingertips. I haven't exported one that big, but usually it only takes a matter of minutes to complete. It's just a simple CSV file.
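To spell that out (a minimal sketch; crmdmgtd is the CiscoWorks daemon manager service, and restarting it stops and restarts every LMS process, so the stuck export job is killed along with everything else):

net stop crmdmgtd
net start crmdmgtd

or, on a soft or physical appliance:

application stop LMS
application start LMS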
Sent from Cisco Technical Support iPad App

Similar Messages

  • Exporting Data for a single set of books from Oracle Financials

    I have a multi-org setup with multiple sets of books in an Oracle instance. I want to be able to export data for a single set of books into a new Oracle instance.
    Any insight into how this can be done (without going through each of the tables and referential integrity constraints) would be of great help.
    thanks
    anoop

    It is a bit unclear what you are sending these POs to the other system for. Is it for matching to invoices there? Is it for receiving into inventory there?
    Do these two entities belong to the same parent but sit on two different Oracle Apps databases?
    In essence, what do they do with the PO that you are sending to them?
    Why do you think there will be intercompany invoicing?
    Thanks
    Nagamohan

  • Best way to export data for a migration

    Hi Oracle Community,
    What's the best way to export data from an Oracle 8i database so that it is suitable for import into an Oracle 10g database?
    What's the best way to export data if it is to go into a different RDBMS?
    Thanks, David

    Thanks everyone for all your help. You guys are great.
    There seem to be many good ways to export your data from Oracle into a flat-file format suitable for import into other RDBMSs: Oracle, MySQL, PostgreSQL, etc.
    A few tools were mentioned, but using SQL*Plus, which comes with Oracle (and SQL*Loader on the back end, which also comes with Oracle), seems the most straightforward.
    I found this script on asktom.oracle.com to work great, slightly modified here
    (to include a maximum linesize, and pipes rather than commas):
    set echo off newpage 0 space 0 pagesize 0 feed off head off trimspool on
    set linesize 32767
    spool payment.txt
    select
    PAYMENT_ID||'|'||
    USER_ID||'|'||
    <more fields here>
    from
    payment ;
    spool off
    exit ;
    It works great. Rather than making one of these for each table, I wrote a Perl script called ora_export: http://crowfly.net/oracle/ora_export. It runs on Unix and only requires SQL*Plus. It creates these four files:
    <tablename>.def       # list of table columns and types (SQL*Plus DESC)
    <tablename>_dump.sql  # a script to export the data
    <tablename>.psv       # the data (e.g. name|address|etc.)
    <tablename>_load.ctl  # SQL*Loader control file for Oracle if you need it.
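    For context, a generated control file for the payment table above would look roughly like this (a sketch only; the real ora_export output may differ, and the column list is cut down to the two fields shown in the query):
    LOAD DATA
    INFILE 'payment.psv'
    INTO TABLE payment
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    (PAYMENT_ID,
    USER_ID)
    Loading it on the target side is then just: sqlldr userid=user/password control=payment_load.ctl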

  • Export data for specific period through Data Pump

    Hi,
    I've a specific requirement to take dumps of some tables for a specific time period, like the last 10 days, e.g. 01-JAN-11 to 10-JAN-11. How can I accomplish this? From the documentation, what I read is that we can export the data for a specific point in time by setting either the FLASHBACK_SCN or FLASHBACK_TIME parameter in the expdp command, but that is a point-in-time export, not an export for a specific time range.
    Please guide me on how I can export between specific times, like between 1-JAN and 10-JAN.
    Regards,
    Abbasi

    "export between the specific time, like between 1-JAN to 10-JAN" - You need to clarify your requirements. Data is always "at a point in time". I can see data as at noon of 01-Jan. I can see data as at noon of 10-Jan. What would I mean by data "between" 01-Jan and 10-Jan?
    Say the table has 5 rows on 01-Jan:
    ID    VALUES
    1     ABC
    2     DEF
    3     TRG
    4     MXY
    5     DEW
    2 rows, "6-GGG" and "7-FRD", were inserted on 02-Jan.
    2 rows, "2" and "3", were updated from "DEF" and "TRG" to "RTU" and "GTR" on 03-Jan.
    1 row, "5-DEW", was deleted on 09-Jan.
    2 rows, "8-TFE" and "9-DZN", were inserted on 09-Jan.
    Can you tell me what is the "data between 01-Jan and 10-Jan"?
    (The above example happens to have an incrementing key column "ID". Your table might not even have such an identifier column at all!)
    Hemant K Chitale
    Edited by: Hemant K Chitale on Jan 10, 2011 5:23 PM
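    For what it's worth, a point-in-time export with FLASHBACK_TIME looks roughly like this (a sketch only; the directory, dump file and table names are placeholders, not values from this thread). The parameter file, say exp_10jan.par, would contain:
    DIRECTORY=dump_dir
    DUMPFILE=as_of_10jan.dmp
    TABLES=my_table
    FLASHBACK_TIME="TO_TIMESTAMP('10-JAN-2011 12:00:00','DD-MON-YYYY HH24:MI:SS')"
    and you would run it with: expdp system/password parfile=exp_10jan.par
    You would need one such export per point in time you are interested in; a single dump cannot capture a range of dates.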

  • Export data for domain data make wrong file

    Hi!
    If I try to export data from a table with a column of type MDSYS.SDO_GEOMETRY in SQL Developer (both 1.2.0 and 1.2.1.3213), the result file contains information like this (for the insert clause):
    Insert into table_name (NUMB,GEOLOC) values (500949,'MDSYS.SDO_GEOMETRY');
    Also, in the previous version (1.2.0), when this column was shown in the data window it was more informative:
    MDSYS.SDO_GEOMETRY(2006, 262148, NULL, MDSYS.SDO_ELEM_INFO_ARRAY(1,2,1,95,2,1,109,2,1,133,2,1,157,2,1), MDSYS.SDO_ORDINATE_ARRAY(22847.57591,7216.21100000001,22842.04691,7217.2571,22841.44841,7218.00440000001,22843.39211,7228.31675000001,22844.13881,7232.35205000001,22845.63335,7239.52580000001,22845.63335,7240.27310000001,22845.03599,7240.72145000001,22826.05499,7244.15885000001,22814.39735,7246.10180000001,22809.01769,7246.84910000001,22807.67249,7246.40075000001,22802.44103,7222.33850000001,22799.19203,7213.03505000001,22795.8656525,7208.73815000001,22794.81386,7208.73200000001,22789.47752,7208.70080000001,22784.3570675,7209.03725000001,22758.6899675,7184.04095000001,22757.3447675,7183.59260000001,22751.9645375,7183.59245000001,22744.006055,7183.03205000001,22743.258785,7181.83640000001,22737.1684775,7181.35070000001,22736.7201725,7182.69575,22729.546295,7183.59245000001,22726.7066975,7186.58165000001,22725.9594275,7186.73105000001,22725.2121575,7186.43210000001,22723.11983,7184.56400000001,22722.29789,7184.48915000001,22721.55062,7186.28270000001,22721.326325,7186.80575000001,22717.515305,7191.36410000001,22715.7218,7193.68070000001,22710.1920875,7200.48080000001,22709.4448175,7206.90740000001,22709.370005,7214.15585000001,22709.74364,7214.52950000001,22711.6866275,7215.35150000001,22711.83611,7216.84610000001,22711.98545,7220.05925000001,22711.611815,7236.12560000001,22711.3876625,7247.63360000001,22711.4249975,7249.76345000001,22710.7523975,7250.95910000001,22710.0051275,7252.45355000001,22849.96763,7244.45780000001,22848.8559875,7243.04300000001,22848.32375,7242.36545000001,22849.51961,7243.41155000001,22848.8559875,7243.04300000001,22846.82921,7241.91710000001,22826.05499,7244.15885000001,22263.062285,7163.22935000001,22263.809555,7173.01865000001,22265.67773,7194.61475000001,22265.902025,7196.78180000001,22265.902025,7197.23015000001,22265.8272125,7197.37970000001,22265.304095,7197.97745000001,22217.9272625,7201.19075,22217.1799925,7201.56440000001,22216.8063575,7202.31170000001,22216.35791,7204.47875000001,22216.731545,7206.12275000001,22800.2381225,7220.28350000001,22798.3699475,7214.23070000001,22796.651255,7211.31620000001,22795.3061975,7209.82175000001,22794.9325625,7209.22385000001,22794.81386,7208.73200000001,22785.5170175,7170.21620000001,22777.3717175,7133.0768,22776.9234125,7130.76035000001,22775.727695,7125.90305000001,22774.6816025,7120.82150000001,22773.7100375,7115.81480000001,22774.53212,7109.98610000001,22774.4573075,7110.73340000001,22773.2617325,7111.70480000001,22773.1870625,7112.45210000001,22773.7100375,7115.81480000001,22773.11225,7113.87185000001,22767.95603,7108.93985000001))
    whereas in the new one it is just:
    MDSYS.SDO_GEOMETRY
    WBR,
    Sergey

    I'm a newbie here and not sure what you want exactly, but:
    First of all, I've created a table on Oracle 10g (10.2.0.3) Enterprise Edition as follows:
    CREATE TABLE tblnm (
         "MI_PRINX" NUMBER(11,0),
         "GEOLOC" MDSYS.SDO_GEOMETRY,
         CONSTRAINT RP_MAP_PK PRIMARY KEY (MI_PRINX)
    );
    INSERT INTO USER_SDO_GEOM_METADATA (TABLE_NAME, COLUMN_NAME, DIMINFO, SRID)
    VALUES ('tblnm','GEOLOC',MDSYS.SDO_DIM_ARRAY(mdsys.sdo_dim_element('X', -100000.0, 185000.0, 1.425E-5), mdsys.sdo_dim_element('Y', -100000.0, 200000.0, 1.5E-5)),262148);
    CREATE INDEX tblnm_SX ON tblnm (GEOLOC)
    INDEXTYPE IS MDSYS.SPATIAL_INDEX;
    insert into tblnm (MI_PRINX,GEOLOC) VALUES
    (1,MDSYS.SDO_GEOMETRY(2001, 262148, NULL, MDSYS.SDO_ELEM_INFO_ARRAY(1,1,1), MDSYS.SDO_ORDINATE_ARRAY(6946.74932,9604.25675000001)));
    After that, I've exported the data from this table with SQL Developer.
    As an insert clause, the result was:
    -- INSERTING into TBLNM
    Insert into TBLNM (MI_PRINX,GEOLOC) values (1,'MDSYS.SDO_GEOMETRY');
    When I tried to import the data (after a delete) with this command, I got:
    ERROR at line 1:
    ORA-00932: inconsistent datatypes: expected MDSYS.SDO_GEOMETRY got CHAR
    For the loader clause, the file looks like:
    LOAD DATA
    INFILE *
    Truncate
    INTO TABLE "TBLNM"
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (MI_PRINX,
    GEOLOC)
    begindata
    "1","MDSYS.SDO_GEOMETRY"
    and so on. The result file doesn't contain data for the sdo_geometry column, only the name of the class.

  • Host Memory Consumed at VM for prime infrastructure 2.0

    Dear All,
    I have had Prime Infrastructure 2.0 for 12 days, and I keep getting this log message on NCS (1. Message: CPU: 85 crossed down threshold: 90). When I looked at the host memory on the Prime Infrastructure VM I saw that it was being consumed, so I scheduled downtime and increased the memory from 24 GB to 40 GB.
    Today I see the host memory has increased to 38.03 GB within 12 days, and in that time I have only added about 30 site maps containing 45 or more APs each (568 site maps in total on the PI).
    Please tell me whether this is normal behaviour or not, and why the host memory keeps increasing. When the host memory is exhausted, I observe that none of the NCS functionality works. Please advise.
    Prime infrastructure 2.0
    VM 5.1.0
    Thanks

    Hi Satyaki,
    Try to reboot the WLC, if possible, and then check the status of the issue.
    Regards-
    Afroz
    **Ratings Encourage Contributors**

  • Export data for SQL Loader

    I have a table with the following 3 columns
    Help_number Number(8,0)
    Title       Varchar(100 Byte)
    Description Varchar(100 Byte)
    I would like to export all the data and import it into another table in another database. I'm using SQL Developer to export the data. I chose the "LOADER" option, but when the data is exported the format is wrong. Here is an example of the data as exported:
    "1","Error","Error 5343 - Input not recognised"
    The problem I have is that the first column is being exported in double quotes even though it is of type NUMBER. When I try to load this using sqlldr it gets rejected because it's a string.
    The other problem I have is that SQL Developer is not exporting all the rows if a table is big. I tried to export a table with 23000 rows and it only exported the first 55 rows.
    Any help will be appreciated.

    The quotes issue I am able to replicate, and I have logged bug #6732587.
    I have also logged a bug for the number of rows; however, if you press Ctrl+End and then export, you'll get all the rows. Also, if you do not want to query back all the rows but still want to export them all, in the Export dialog just click the "where" clause tab and then Apply. This will also bring back all the rows. This bug is not only for Loader, but for any export format.
    Sue
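    As a workaround for the quoted NUMBER column, a control file along these lines will normally load the quoted values without complaint (a sketch only; the table name help_texts is made up, and the three columns are the ones listed above):
    LOAD DATA
    INFILE 'help_texts.ldr'
    INTO TABLE help_texts
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (HELP_NUMBER,
    TITLE,
    DESCRIPTION)
    With OPTIONALLY ENCLOSED BY '"', SQL*Loader strips the quotes before converting the field, so "1" loads into the NUMBER column without being rejected.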

  • Export data for offer/rejection letter Issue

    The functionality for generating offer letters works fine from the users' laptops; however, end users who are on desktops are not able to view the data and only the .rtf template appears.
    Could anyone assist me in resolving this issue, please?

    Hi Sven,
    Thank you for your prompt response.
    I'm doing the transport from DEV to QA initially.
    When I'm creating the transfer rules in the target system I'm getting the following message:
    Assignment of InfoObjects to DataSource
    80PROFIT_CTRH fields incomplete
    In the transfer rules it creates the following field even though it does not exist in the original datasource. Deletion of this field is obsolete.
    0SEM_CGPRCT
    Kindly need your assistance on this.
    Thank you

  • Export data for the table of SAP using  the JCO

    Good morning,
    Sorry for any errors in wording; I am using a translator.
    I need help solving a problem I have integrating a system in Java with SAP using SAP JCo.
    I need to send data (Export_Parameter) into a table in SAP, but I do not know how this process works.
    If someone has been through this and can help, thank you in advance.
    Note: I have not found documents that help me.
    Regards,
    Elton C.

    This is a forum for problems with the Java language and the default libraries that come with it. We can't help you with any third-party applications/libraries such as SAP and JCO.
    Sorry.

  • Upgrade Cisco Prime LMS 4.0 to Cisco Prime Infrastructure 2.1

    My company has asked me to migrate Prime LMS 4.0 to Prime Infrastructure 2.0 for a client, and I cannot find the guide. My questions are: is the migration possible without an intermediate step, and is there a guide for updating LMS 4.0 to PI 2.0?
    Thanks in advance.

    Prime LMS 4.2.2 added support for the PI export directly:
    Export Data for Prime Infrastructure
    Cisco Prime LAN Management Solution supports data migration to Prime Infrastructure 1.2. This feature enables users to export the Device Credentials Repository (DCR) data from LMS 4.2.x versions to Prime Infrastructure 1.2. The data backup status and backup location are displayed at the bottom of the Export Data for Prime Infrastructure page. The data exported from LMS and the data imported into PI are listed below.
    Data exported from LMS
    •Device Credentials Repository (DCR)
    •Software Image Management
    •Config Archive
    •Change Audit
    •Syslog
    Data Imported to PI
    Device Credentials Repository

  • Prime Infrastructure 2.2 release date

    Hi
    I would like to know the release date of Prime Infrastructure 2.2.
    It is mentioned that it is needed to get the lightweight 1700 series APs into Prime:
    http://www.cisco.com/c/en/us/td/docs/wireless/compatibility/matrix/compatibility-matrix.html#31079

    Middle of the year... or so. :)
    v7.4.121.0 is a good, stable code. If you need to support the 3700s or need newer features, then v7.6 is your only choice.
    Sent from Cisco Technical Support iPhone App

  • Prime Infrastructure 2.0: Open Database Schema Support available?

    Hello,
    I searched the forum for open database schema support for PI.
    I only found open database schemas for CiscoWorks LAN Management Solution and Cisco Prime LAN Management Solution.
    Does an open database schema exist for Prime Infrastructure 2.0, as it does for the older LAN Management Solutions?
    Thanks.
    Bastian

    The programmatic interface to Prime Infrastructure is the REST API rather than any open database schema.
    There are published reference guides here. For the most up-to-date information, please see your PI server itself. In Lifecycle view, the Help menu has a link to the server-based API information.
  • Migrate from Prime LMS to Prime Infrastructure

    I'm currently running CiscoWorks LMS 4.0.1 on Windows 2003 under VMware and just got upgrade licensing for Prime Infrastructure 1.2.  I am assuming that I will need to upgrade the current server to Prime LMS 4.2 in order to ensure that data migration to Prime Infrastructure goes well.  I am planning to follow Cisco's recommendation to run Prime LMS and Prime Infrastructure in parallel for a time and migrate individual functions.
    My real question is about Syslog handling.  All of the managed devices are currently sending Syslog data to LMS.  As a last step in the migration, is it possible to change the IP address of the Prime Infrastructure server to replace the Prime LMS server so that the Prime Infrastructure server will just start getting all the Syslog data, or do I need to go change hundreds of managed devices to point to a new address?
    Thanks for any help you can provide.

    Putting LMS up against Prime Infrastructure is like comparing Apples to Oranges.
    LMS is a very mature product and Prime Infrastructure is in its infancy...
    My suggestion would be to leave the current LMS system up and running; bring a Prime Infrastructure server up with a demo license; add a few of each type of device to your Prime Infrastructure server; and set up a proof-of-concept scenario to prove you can do everything in Prime that you can do in LMS.

  • WCS export to NCS/Prime 1.3

    I have a situation where we need to export WCS data to Prime Infrastructure 1.2 (or 1.3).
    The problem is that the Prime 1.2 server is already set up for the network and WAN.
    Is it possible to migrate the WCS data on top of an existing database on a Prime server
    if it already has its own database? Obviously I don't want to destroy the current database I have.

    You cannot migrate data over data. If you migrate the current data, the old data will be lost on the Prime server.
    Rating useful replies is more useful than saying "Thank you"

  • How to extract data for particular two members of same dimension.

    As per the requirement, I need to export data for certain members of a dimension. Let's say we need data for two account members, A and B, which are in the Account dimension but are not direct children. I need the data for all the available years too. Please suggest how my DATAEXPORT command should look.
    When I use an AND statement it does not work as expected. Say I fix on years 2007 and 2009, but the output file comes out for 2009 and 2010.
    Something similar happens when I fix on OPEX_31 and OPEX_32: the values come out not only for OPEX_31 and OPEX_32 but for many more accounts too.
    Here is my dataexport statement for your reference
    SET DATAEXPORTOPTIONS
    {
    DataExportLevel "ALL";
    DataExportColFormat ON;
    DataExportDimHeader ON;
    DataExportOverwriteFile ON;
    };
    FIX("LC","Total_Year","ESB1","2009","SIERRA","COSTCENTER_NA","CELLULAR_NA","OPEX_31",
    "January","February","March","April","May","June","July","August","September","October","November","December");
    DATAEXPORT "File" "     " "D:\exports\feb.txt";
    ENDFIX;
    I need data for OPEX_31 and OPEX_32 for all the available years, starting from 2001 to 2025.
    Please suggest what modifications are needed to get the desired result.
    Thanks in advance

    Hi,
    There are a few different options you can use for fixing on the months and years,
    e.g. FIX("January":"December")
    or FIX(@CHILDREN("YearTotal")) <- depends what the parent of the months is.
    The same goes for years:
    FIX("2009":"2025")
    or
    FIX(@CHILDREN("Year"))
    If your Period dimension is dense you can always use that as the column header, e.g. DataExportColHeader "Period", and then fix on the accounts you require (see the sketch after this reply).
    Cheers
    John
    http://john-goodwin.blogspot.com/
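    Pulling those suggestions together, a version of the script that exports just OPEX_31 and OPEX_32 across all years might look roughly like this (a sketch only; every member name other than OPEX_31/OPEX_32 is taken from the original script and may need adjusting to the real outline):
    SET DATAEXPORTOPTIONS
    {
    DataExportLevel "ALL";
    DataExportColFormat ON;
    DataExportDimHeader ON;
    DataExportOverwriteFile ON;
    };
    FIX("LC","Total_Year","ESB1","SIERRA","COSTCENTER_NA","CELLULAR_NA",
    "OPEX_31","OPEX_32",
    "2001":"2025",
    "January":"December")
    DATAEXPORT "File" "|" "D:\exports\opex.txt";
    ENDFIX;
    Because OPEX_31 and OPEX_32 are the only Account members in the FIX, no other accounts should appear in the output file.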
