WCS 7.0.240.0 data export to Prime 1.2.1.012 ... HELP!!

Hello Fellow Engineers,
I'm trying to migrate old data from WCS 7.0 to Prime 1.2 ... I have already created the zip file from WCS and imported it into the defaultRepo on Prime.  I can see it in the directory when I do a show repository defaultRepo, so I have confirmed that it is there.  My issue is that Prime 1.2.1.012 doesn't appear to accept the ncs migrate CLI command.
So, can someone let me know how I can proceed with the migration?  I haven't been able to find any similar command in the Prime CLI, so at this point I'm lost as to where to go from here.  I really don't want to have to tell the customer they have to downgrade to Prime 1.1 in order to preserve their maps, especially since the Deployment Guide for Prime Infrastructure states that it is possible to migrate data from WCS 7.0 to Prime 1.2, as shown below ...
Data Migration
Data can be migrated from WCS 7.0, NCS 1.1, or LMS 4.x. More details on migrating data from each of these applications are spelled out in the following sections.
From WCS
You must upgrade the Cisco WCS server to one of the following releases before you attempt to perform the migration process to Cisco Prime Infrastructure 1.2:
• 7.0.164.3
• 7.0.172.0
• 7.0.220.0
This section provides instructions on how to migrate the WCS on either a Windows or Linux server to Cisco Prime Infrastructure.
Exporting Data from WCS
Export data from WCS 7.x through the CLI. The export userdata CLI command, available in WCS Release 7.x and later, creates the ZIP file that contains the WCS data. The CLI does not provide any option to customize what is exported; all non-global user-defined items are exported. Complete these steps in order to export WCS data:
1. Stop the WCS server.
2. Run the export command through the script file and provide the path and export filename when prompted.
3. For Linux, run the export.sh all /data/wcs.zip command.
For Windows, run the export.bat all \data\wcs.zip command.
Importing Data into Cisco Prime Infrastructure
Complete the following steps to migrate data from WCS (the full command sequence is consolidated below the steps):
1. Place the WCS export ZIP file (for example, wcs.zip) in a repository or folder (for example, repositories).
2. Log in as the admin user and stop the Cisco Prime Infrastructure server by entering the ncs stop command. Configure the FTP repository on the Cisco Prime Infrastructure appliance using the repository command, as shown in the configuration snippet below:
pi-appliance/admin# configure
pi-appliance/admin(config)# repository pi-ftp-repo
pi-appliance/admin(config-Repository)# url ftp://209.165.200.227/backup
pi-appliance/admin(config-Repository)# user ftp-user password plain ftp-user
Note: Make sure the archived file is available with the show repository <repositoryname> command.
3. Enter the ncs migrate command in order to restore the WCS database.
pi-appliance/admin# ncs migrate wcs-data wcs.zip repository pi-ftp-repo
4. By default, no WCS events are migrated. Enter the ncs start command in order to start the Cisco Prime Infrastructure server after the upgrade is completed. Log in to the Cisco Prime Infrastructure user interface with the root login and the root password.
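Putting the guide's commands together, the full sequence I expect to run looks something like the sketch below (this reuses the guide's example repository name, FTP server, and credentials as placeholders, and the show repository output line is only illustrative):
pi-appliance/admin# ncs stop
pi-appliance/admin# configure
pi-appliance/admin(config)# repository pi-ftp-repo
pi-appliance/admin(config-Repository)# url ftp://209.165.200.227/backup
pi-appliance/admin(config-Repository)# user ftp-user password plain ftp-user
pi-appliance/admin(config-Repository)# exit
pi-appliance/admin(config)# exit
pi-appliance/admin# show repository pi-ftp-repo
wcs.zip
pi-appliance/admin# ncs migrate wcs-data wcs.zip repository pi-ftp-repo
pi-appliance/admin# ncs start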
Any HELP on this would be greatly appreciated!!! 
Thanks in advance!!!!

Scott,
I don't have an FTP client ... I typically use TFTPd for any upgrades I have to do.  I thought I could use this to transfer the patch and export.zip to NCS?  However, when I tried to run the patch install it gave me this error ...
% Manifest file not found in the bundle
Can you tell me if you have experienced this before?  Also, if the problem is that I'm using TFTPd, can you recommend an FTP client application that I should use?
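For what it's worth, if the repository command also accepts tftp:// URLs (an assumption on my part; the guide above only shows an FTP example), I could presumably point a repository at my TFTPd server along the same lines, e.g. (the server address below is just a placeholder):
pi-appliance/admin# configure
pi-appliance/admin(config)# repository pi-tftp-repo
pi-appliance/admin(config-Repository)# url tftp://192.0.2.10/
pi-appliance/admin(config-Repository)# exit
pi-appliance/admin(config)# exit
pi-appliance/admin# show repository pi-tftp-repo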
Thanks!

Similar Messages

  • Data Export From InDesign CS6 to Excel Please Help

    Hi!
    I need to export text from a 416-page catalog into an Excel document. The text is formatted with paragraph and character style sheets, but the text frames do not link together. Alternatively, is there a way to export the text from InDesign, with the page number, into a text file, and then the text file into an Excel document? Thank you so much for your help!

    That is not possible. You're going from a "smart" layout application to a dumb one.
    I don't think you could even place the text into Word as a PDF file and have it work.

  • Essbase Data Export not Overwriting existing data file

    We have an ODI interface in our environment which is used to export data from Essbase apps to text files using Data Export calc scripts; we then load those text files into a relational database. Lately we have been seeing an issue where the Data Export calc script is not overwriting the file and is just appending the new data to the existing file.
    The OverWriteFile option is set to ON.
    SET DATAEXPORTOPTIONS {
         DataExportLevel "Level0";
         DataExportOverWriteFile ON;
         DataExportDimHeader ON;
         DataExportColHeader "Period";
         DataExportDynamicCalc ON;
    The "Scenario" variable is a substitution variable which is set during the runtime. We are trying to extract "Budget" but the calc script is not clearing the "Actual" scenario from the text file which was the scenario that was extracted earlier. Its like after the execution of the calc script, the file contains both "Actual" and "Budget" data. We are not able to find the root cause as in why this might be happening and why OVERWRITEFILE command is not being taken into account by the data export calc script.
    We have also deleted the text data file to make sure there are no temporary files on the server or anything. But when we ran the data export directly from Essbase again, then again the file contained both "Actual" as well as "Budget" data which really strange. We have never encountered an issue like this before.
    Any suggestions regarding this issue?

    Did some more testing and pretty much zeroed in on the issue. Our Scenario is actually something like "Q1FCST-Budget", "Q2FCST-Budget", etc.
    This is the reason we need to use a member function: the calc script reads "&ODI_SCENARIO" (which is set to Q2FCST-Budget) as a number and gives an error. To convert this value to a string we are using the @member function, and this seems to be the root cause of the issue. The ODI_SCENARIO variable is set to "Q2FCST-Budget", but when we run the script with the @member("&ODI_SCENARIO") function, the data file brings back the values for "Q1FCST-Budget" out of nowhere, in addition to the "Q2FCST-Budget" data we are trying to extract.
    Successful Test Case 1:
    1) Put Scenario "Q2FCST-Budget" in hard coded letters in Script and ran the script
    e.g "Q2FCST-Phased"
    2) Ran the Script
    3) Result OK. Script overwrote the file with Q2FCST-Budget data
    Successful Case 2:
    1) Put scenario in @member function
    e.g. @member("Q2FCST-Budget")
    2) Results again ok
    Failed Case:
    1) Deleted the file
    2) Put the scenario in a substitution variable, used the member function @member("&ODI_SCENARIO"), and ran the script. *ODI_SCENARIO is set to Q2FCST-Budget in Essbase variables.
    e.g. @member("&ODI_SCENARIO")
    3) Result : Script contained both "Q1FCST-Budget" as well as "Q2FCST-Budget" data values in the text file.
    We are still not close to the root cause and why is this issue happening. Putting the sub var in the member function changes the complete picture and gives us inaccurate results.
    Any clues anyone?
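    For reference, the cases above differ only in how the scenario member is supplied to the FIX statement. With the SET DATAEXPORTOPTIONS block quoted earlier, the export section being discussed has roughly this shape (the output path is a placeholder, not the actual production script):
    /* Failing combination: substitution variable wrapped in @member.                   */
    /* The successful tests replaced the member below with "Q2FCST-Budget" (hard-coded) */
    /* or @member("Q2FCST-Budget"); only the substitution-variable form misbehaves.     */
    FIX (@member("&ODI_SCENARIO"))
        DATAEXPORT "File" "," "/tmp/export.txt";
    ENDFIX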

  • Automate data export from website

    Hello,
    I'm stuck with an issue regarding data export from scopus.com. I have 61 keywords for which I want to export (as CSV) the number of documents by year, country, document type, and subject area. Doing that by hand would be extremely tedious, hence my question whether this can be automated. Unfortunately scopus does not provide an API to access the data directly.
    So, the process to be automated would be:
    1. Open http://www.scopus.com/home.url
    2. Enter the keyword in "search for"
    3. Click on "search"
    4. Click on "Analyze results"
    5. Click on tab "year" and "export". Save the file
    6. Click on tab "country" and "export". Save the file
    6. Click on tab "document type" and "export". Save the file
    6. Click on "subject area" and "export". Save the file
    Do programs exist that help me retrieving the data?
    Thanks!

    You could achieve your goal with bash and wget, with some combination of --save-cookies and --post-data.
    But be aware that scopus is run by Elsevier, whom many view as the most evil of the publishing companies. Think twice before doing something that may annoy them.
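    As a rough illustration of that idea (the scopus.com URLs and form field names below are purely hypothetical; you would need to inspect the real requests in your browser's developer tools and adjust accordingly):
    #!/bin/bash
    # Hypothetical sketch only: endpoint paths and field names are placeholders.
    while read -r keyword; do
      # Submit the search and keep the session cookies for the follow-up requests.
      wget --save-cookies cookies.txt --keep-session-cookies \
           --post-data "searchterm=${keyword}" \
           -O "result-${keyword}.html" "http://www.scopus.com/results"
      # Re-use the cookies to pull each analysis tab's CSV export.
      for tab in year country doctype subjarea; do
        wget --load-cookies cookies.txt \
             -O "${keyword}-${tab}.csv" "http://www.scopus.com/export?tab=${tab}"
      done
    done < keywords.txt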

  • Unable to stop incorrect date exports

    How do we set up a form in Adobe Acrobat XI that allows dates to be formatted a certain way (mmm/dd/yyyy) and exported in the same way to Excel and always be recognized as a "proper" date in Excel?
    Currently the following does not work (Attempt #1):
    Set up a field; Set the format category as date; Set up a Custom format of "mmm/dd/yyyy"
    Create a distribution file
    When users fill out the form, if they type in an incorrect date, e.g., "August 27 2013", the form automagically shows the date on the PDF as "Aug/27/13" - Great!
    When the users submit the form and it's brought into the response file the dates are shown in a default date format of mm/dd/yyyy - Fine, once the form owners understand this
    When the form owners export the information, the data exported is the same as what the users originally entered, not what it was automagically formatted to. For instance, if submitters originally entered "August 27, 2013" then that's what goes across to Excel, and some of these formats Excel doesn't know how to convert. - Understandably frustrating for form owners
    Attempt #2: As a workaround we set up special formatting that has a mask of "AAA/99/9999". This at least forces the users to use the same formatting, but it confuses submitters when they need to enter dates from 1-9, and we've also found that the conversion of this format to a date in Excel doesn't work, but at least it's consistent! JavaScript was also added to force users to use specific month abbreviations.
    d = new Date(event.value.replace(/-/g, " "));
    if (!Date.parse(d)){app.alert("Please enter in a proper date.\n\nThe month abbreviations are: Jan, Feb, Mar, Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, Dec");}
    Attempt #3: The last attempt was to continue to use the set up as Attempt #1, but to also use the javascript from Attempt #2. The theory being that if a user entered in "August 27 2013" the javascript would complain. Alas, the javascript appears to run after Adobe automagically does its date format conversion.
    Does anyone know how to get around this or have any other ideas to either enforce a usable date format or have Adobe export the dates as they've been automatically formatted to? We've tried to find a way to turn off the automatic date conversion that Adobe's running, but haven't found a way yet. Another option seemed to be to allow a masking that allowed for optional characters (so that the "0" wouldn't be needed for the dates 1-9) but there doesn't seem to be one.
    Thanks in advance!
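    For reference, the validation pattern in Attempts #2 and #3 boils down to a custom Validate script along these lines (a rough sketch using util.scand, illustrative only and not our exact script):
    // Custom Validate script for the date field.
    // util.scand() returns null when the entry cannot be parsed with the given format.
    var d = util.scand("mmm/dd/yyyy", event.value);
    if (event.value !== "" && d == null) {
        app.alert("Please enter a proper date (mmm/dd/yyyy).\n\nThe month abbreviations are: Jan, Feb, Mar, Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, Dec");
        event.rc = false; // reject the entry and keep the previous value
    }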

    Since there was no clear way to ensure that the date formatting was correct prior to exporting, we're going to get the respondents to use drop-downs to ensure the formatting is correct. It's not the most convenient approach for users, though, as they're accustomed to being able to type values to select them (e.g., for the date of 23 they would expect to enter 2 then 3) based on other applications, but the Adobe pull-downs don't "group" what's been entered (e.g., 2 then 3 will select 30, not 23), so it will take them a bit to get used to it. I still can't believe that Adobe wouldn't simply export what the field has been formatted to, though... after all, that's what we set the form up for.

  • Data Export error when migrating cucm cluster on version 7.1.5 to cucm 10.0

    Hi
    Has anyone come across below? If so any suggestions for workaround?
    Oct 01, 2014 11:54 PDT   STATUS   The task has been scheduled.
    Oct 01, 2014 11:54 PDT   INFO     Export task action ID #154 with 1 node(s) scheduled.
    Oct 01, 2014 11:54 PDT   STATUS   The task has started.
    Oct 01, 2014 11:54 PDT   INFO     Export task action ID #154 with 1 node(s) started.
    Oct 01, 2014 11:54 PDT   INFO     Export job for node xx.xx.xx started.
    Oct 01, 2014 12:09 PDT   ERROR    Data export failed for node xx.xx.xx.
    Oct 01, 2014 12:09 PDT   ERROR    Export job for node xx.xx.xx failed.
    Oct 01, 2014 12:09 PDT   ERROR    1 node(s) in Export task action ID #154 failed: xx.xx.xx
    Oct 01, 2014 12:09 PDT   ERROR    Task paused due to task action failures.

    Hi,
    you can log in to PCD through PuTTY to see the logs:
    file list activelog tomcat/logs/ucmap/log4j/ detail date
    then run, for example:
    file tail activelog tomcat/logs/ucmap/log4j/ucmap00001.log
    regds,
    aman

  • What rights are needed to do a Data Export?

    I would like someone from our help desk to be able to do Data-->Export
    in Console1 to export inventory data regularly. When they try this,
    they immediately get the following error:
    Data Export will not proceed. Unable to identify the type of
    installation.
    I saw TID 10088974 but I don't think it applies, because if I log in to
    the same PC it works (running Console1 locally).
    I also saw another TID indicating user needs Browse rights to the
    ZEN_invDatabase object. [Public] does have rights to this.
    What rights do I need to grant?
    -Marc Johnson

    On Wed, 15 Jun 2005 22:22:40 GMT, Marc Johnson wrote:
    > What rights do I need to grant?
    they will need the read right to the properties of the database object...
    Marcus Breiden
    Please change -- to - to mail me.
    The content of this mail is my private and personal opinion.
    http://www.edu-magic.net

  • BPC10 - Data manager package for dimension  data export and import

    Dear BPC Experts,
    Need your help.
    I am trying to set up a data manager package for the first time, to export dimension master data from one application and import it into another application (both have the same properties).
    I created a test data manager package via Organize > Add Package, with the process chain /CPMB/EXPORT_MD_TO_FILE, and clicked Add.
    In the Advanced tab of each task there is some script logic already populated. Please find below the details of the script logic written under each of the tasks (MD_Source, Convert and Target).
    I have not made any changes to the script inside the tasks.
    But when I run the package and select the dimension 'Entity', the second prompt asks for a transformation file, and the system automatically adds the file ...\ROOT\WEBFOLDERS\COLPAL\FINANCE\DATAMANAGER\TRANSFORMATIONFILES\Import.xls
    I have not changed anything there.
    In the next prompt it asks for an output file, and it won't allow me to enter the file name.
    Not sure how to proceed further.
    I would be grateful if someone could guide me, from your experience, on how to set up a simple data manager package for master data export from a dimension. Should I update the transformation file in the script, and the import file and output file in the Advanced tab? How and what transformation file should be created and linked to the data manager package for export/import?
    What are the steps to run the package for exporting master data from a dimension and importing it into another application?
    Thanks in advance for your guidance.
    Thanks and Regards,
    Ramanuj
    =====================================================================================================
    Details of the tasks
    Task : APPL_MD-SOURCE
    (DIMENSIONMEMBER, %DIMENSIONMEMBERS%, "Please select dimension", "Please select members", %DIMS%)
    (TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    (OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
    (RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
    (%TEMPNO1%,%INCREASENO%)
    (%TEMPNO2%,%INCREASENO%)
    (/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
    (/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
    (/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
    (/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
    (/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
    (/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
    (/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
    (/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
    (/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
    (/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
    (/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
    Task : EXPORT_MD_CONVERT
    (DIMENSIONMEMBER, %DIMENSIONMEMBERS%, "Please select dimension", "Please select members", %DIMS%)
    (TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    (OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
    (RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
    (%TEMPNO1%,%INCREASENO%)
    (%TEMPNO2%,%INCREASENO%)
    (/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
    (/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
    (/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
    (/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
    (/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
    (/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
    (/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
    (/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
    (/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
    (/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
    (/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
    Task : FILE_TARGET
    (DIMENSIONMEMBER, %DIMENSIONMEMBERS%, "Please select dimension", "Please select members", %DIMS%)
    (TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    (OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
    (RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
    (%TEMPNO1%,%INCREASENO%)
    (%TEMPNO2%,%INCREASENO%)
    (/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
    (/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
    (/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
    (/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
    (/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
    (/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
    (/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
    (/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
    (/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
    (/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
    (/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
    ================================================================================

    1. Perhaps you want to consider a system copy to a "virtual system" for UAT?
    2. Changes in QAS (as with PROD as well) will give you the delta. They should ideally be clean... You need to check the source system.
    Another option is to generate the profiles in the target system. But for that your config has to be squeaky clean and in sync, including very well maintained and synced SU24 data.
    Cheers,
    Julius

  • Data Export to Oracle table from essbase issue

    Hello,
    I am using a Data Export calc script to load data from Essbase into an Oracle table. We have Essbase 11.1.2.1 on a Windows 2008 R2 64-bit server. I have an ODBC system DSN created for this job.
    However, when I launch this process I get a message in the log: "Cannot read SQL driver name for [Backup Exec Catalogs] from [ODBC.INI]"
    I have also checked the ODBC.ini file in the C:\Windows directory, and that file contains the connection entries...
    Any thoughts as to why I am getting this error on one server whereas the same process on the SIT server works fine?
    thanks...

    Please restart the application and try again. Check the application log to see if any details are there.

  • HR Master Data Export error

    Hi all,
    I'm working at a client where we upgraded from 4.6 to 6.0. This client has never used the master data export program.
    I'm trying to configure the master data IDOCs, but I'm getting an error when running the program RPCEMDU0_CALL. It refers to an OSS note from back in 2001, so that really does not apply to this installation. The problem seems to be with function module HR_PU12_UPDATE_T532K_T532L on IT0106: some sort of data dictionary discrepancy. I haven't been successful in finding any newer notes related to this.
    Has anyone experienced this problem before?
    Thanks in advance for your response.
    Cesar

    Hi Bernd,
    Thanks for your response. I looked at that OSS note but it doesn't seem to be related to the problem I'm having.
    The error reads:
    "Export program terminated   A009
    Export program obsolete, regeneration required
    Do not regenerate immediately: First make a  $STRUC_P0106 back up of the include
    PERNR N 000008 000000
    (See note 375108) This include is very important if old cluster IF records"
    I've re-activated the program and interface formats, but still getting the same error.
    Thanks again,
    Cesar

  • Problem in data export

    Hi All,
    I have a problem regarding data export. When I use the EXP command from my Oracle form to export data from the database, the dump file is larger than the dump produced when I export data from the same database using DBMS_DATAPUMP. With EXP the dump size is, say, 40,225 KB, while for the same database DBMS_DATAPUMP produces, say, 36,125 KB. Why does this difference occur? Is it a problem? What would be the solution?
    Please Help ASAP.
    Thanks in advance.
    Regards
    Sanjit Kr. Mahato
    Edited by: userSanjit on Jul 23, 2009 6:19 PM
    Edited by: userSanjit on Jul 23, 2009 6:24 PM

    Hi,
    Data Pump (expdp) and exp are different Oracle export utilities, so the output file sizes are not the same; that is why the difference occurs.
    No, this is not a problem.
    Since this is not a problem, there is no solution needed.
    Why do you see this as a problem?
    Cheers
    Anurag

  • Data export BR execution Time

    Hi all,
    The data export BR is taking more than 13 hours to execute, and after it finishes I get an exported file which has only 7,000 rows.
    We have data at only 4 members out of 783 members of a particular dimension, but the BR runs for all members. My question is: if the data size increases, i.e. if we get data at all 783 members, will the execution time of the same BR stay at 13 hours (since we are executing the BR for the same number of combinations), or will it take more than 13 hours?
    Thanks.

  • XML Multiple Data Export/Import into New Form

    Is it possible to import an XML file with multiple form data (exported via "Merge data files into spreadsheet" option, then saved as XML) into a "template" form and create individual forms from the multiple data sheet? In other words, I've merged 65 forms' data into an XML file. Now I'd like to import it all back into an updated form.
    What I've been doing now is exporting the XML data individually for each form and importing each form individually into the new form.
    One option is to extend rights to the users so they can import and export themselves, but I'm still looking into the Formrouter service, which, if implemented, won't be for a while.
    Any solutions to this painful process?
    Thanks - Derek
    I just realized this may be a question for a different forum...Acrobat... My apologies.

    Hi Derek,
    Without the LC Enterprise server product(s) I don't think you will be able to achieve this. Acrobat.com gives a mechanism for distributing the form; I am fairly sure it will allow you to view the responses in a new form.
    Also, applying reader extensions to the form with Acrobat will not help, as this removes the ability to import/export XML. See https://acrobat.com/#d=3lGJZAZuOmk8h86HCWyJKg. If you are extending rights with LC Reader Extensions ES then this restriction should not apply.
    If you have the 65 XML responses, I would be inclined to bite the bullet and manually import the XML into the new form.
    Good luck,
    Niall

  • Data export(ttbulkcp) Oracle TimesTen Question

    I'm trying to export from Oracle TimesTen (TimesTen Release 11.2.1.5.0) with ttbulkcp (Data export), using SQL Developer Version 2.1.1.64.
    All other functions operate normally, and ttbulkcp (Data export) on an Oracle table also works fine. But I get the following error from ttbulkcp (Data export) on a TimesTen table:
    java.lang.NullPointerException
         at oracle.dbtools.raptor.format.ResultsFormatterWrapper.getColumnCount(ResultsFormatterWrapper.java:67)
         at oracle.dbtools.raptor.format.ResultsFormatter.getColumnCount(ResultsFormatter.java:130)
         at oracle.dbtools.raptor.timesten.format.TimesTenLoaderFormatter.getColumns(TimesTenLoaderFormatter.java:207)
         at oracle.dbtools.raptor.timesten.format.TimesTenLoaderFormatter.printColumnData(TimesTenLoaderFormatter.java:183)
         at oracle.dbtools.raptor.timesten.format.TimesTenLoaderFormatter.start(TimesTenLoaderFormatter.java:73)
         at oracle.dbtools.raptor.format.ResultSetFormatterWrapper.print(ResultSetFormatterWrapper.java:150)
         at oracle.dbtools.raptor.format.ResultsFormatter.print(ResultsFormatter.java:200)
         at oracle.dbtools.raptor.format.ResultsFormatter.doPrint(ResultsFormatter.java:416)
         at oracle.dbtools.raptor.dialogs.actions.TableExportAction$5.doWork(TableExportAction.java:637)
         at oracle.dbtools.raptor.dialogs.actions.TableExportAction$5.doWork(TableExportAction.java:634)
         at oracle.dbtools.raptor.backgroundTask.RaptorTask.call(RaptorTask.java:193)
         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
         at oracle.dbtools.raptor.backgroundTask.RaptorTaskManager$RaptorFutureTask.run(RaptorTaskManager.java:492)
         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
         at java.lang.Thread.run(Thread.java:619)
    Driver not capable
    Thank you.
    GooGyum

    If you have a DB support contract, I suggest you open an SR on Metalink/MOS to get an official response and follow-up; otherwise you can hope someone from development picks it up here.
    Regards,
    K.

  • Data Exportation

    I am stumped on a data export issue and thought I would post it on the forum to see if anyone can help me. We are using Planning 11.1.1.3.
    In Essbase we have a calc script written to export data to a txt file. When I ran the calc script recently, I noticed a department (dept 101) was not being included in the export file. Here is what has me stumped:
    In the Department dimension, dept 101 sits between dept 100 and dept 102. The export file pulls in dept 100 and dept 102, so I have to think that the CS is written correctly. There is nothing in the CS that singles out dept 101; the CS is written very generally.
    When I look at the departments in Planning, I see nothing different in the setup for dept 101 compared to the setup of departments 100 and 102. I realize it's probably hard to diagnose the issue without viewing the dimensions, export file, CS, etc. However, any comments/suggestions would be greatly appreciated. Thx!

    When I look at the departments in Planning, I see nothing different in the setup for dept 101 compared to the setup of departments 100 and 102.
    ---- Did you check the storage properties and look into the possibility of implied sharing? (Just an area to watch out for; it sounds similar to such a case, but I don't have the outline so I can't really tell. Just a general point.)
    -VD
