Issue with data/deploying

What I do is this: I create a new user, along with a new location and module.
Then I import my .dmp file (imp user/pass@orcl file=location full=y), and everything seems to work.
Then I import my data into OWB and get everything I need. I check the data (right-click > Data) and everything is fine; the tables are populated.
Then I try to deploy the new tables. With the Create action I get two errors: CREATE (table name is already in use) and DROP (cannot drop because of FK constraints, etc.). I guess it tried to create a table, couldn't, and then tried to drop it.
My second attempt uses the Replace action, which deploys successfully, but then all of my data is missing. The tables are empty.
My question is: how can I deploy without losing my data?
Cheers

Typically you would import the metadata from the source location and either use that location as the data source (and so not need to redeploy), or deploy it to a separate target location.
The replace action is destructive as you've found, and effectively performs a drop table followed by create table. Hence any data in the table is lost.
If you just want the Control Center Manager to correctly display that the table is deployed, try setting the action to "Upgrade". This will try to upgrade the deployed object to match the definition in OWB, but as the two are identical this will result in no changes. However, it will update the deployment records to indicate that the object is deployed.
Nigel.
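
To make the behaviour concrete, here is a rough SQL sketch of what the two actions amount to; the table and column names are invented purely for illustration and are not from the original post.

-- Replace action: effectively drop and re-create, so previously loaded rows are lost.
DROP TABLE customers CASCADE CONSTRAINTS;
CREATE TABLE customers (
  customer_id   NUMBER PRIMARY KEY,
  customer_name VARCHAR2(100)
);

-- Upgrade action: reconcile the deployed table with the OWB definition in place,
-- e.g. via ALTER statements, so existing rows survive. If the definitions already
-- match, no DDL is issued and only the deployment records are updated.
ALTER TABLE customers ADD (region VARCHAR2(30));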

Similar Messages

  • Issue with Data Provider name in variable screen for BEx Analyzer

    Hello all,
    We have an issue with the Data Provider name in the variable screen in BEx Analyzer.
    We want the DataProvider name shown there to be the description of the report instead of its technical name.
    Any inputs are appreciated.
    Thanks
    Kumar

    You have to create a workbook to do this.
    Refresh your query/report. In BEx Analyzer there is a toolbar named the BEx design toolbox; if you cannot see it, right-click on the toolbar area of BEx Analyzer and enable the BEx design toolbox. Go into design mode by clicking the symbol that looks like an 'A'. Then place the cursor where you want the query description to appear and click Insert Text (T) in the BEx toolbox. Check "Query description" in the Constants tab. In the General tab you need to assign a DataProvider, so assign your query name in the workbook settings (in the BEx design toolbox), and also check "Display caption" in the General tab.
    Pravender

  • Issue with data dictionary -Table maintanance generator

    Hi all,
    I have an issue with the Data Dictionary table maintenance generator. I entered some records in a custom table (ZBCSECROLETOGRP) and changed the delivery class from C to A. When I create the table maintenance generator, I get the following errors:
    1)Field ZBCSECROLETOGRP-PORTALGROUP shortened (new visible length: 000032)
    2)0012 could not be generated
    3)In TCTRL_ZBCSECROLETOGRP field LENGTH has the invalid value 01
    My main goal is to create the table maintenance generator and transport it to the subsequent systems.
    Please help.
    Thanks in advance,
    Vishal..

    Hi,
    Regenerate the table maintenance by selecting the "Modified field structure" checkbox => new entry, and then save.
    Also make sure the new changes do not affect old data because of data type changes. If they do, delete the old records, regenerate the table maintenance, and re-enter the records you deleted.
    Thanks,
    Best regards,
    Prashant

  • Issue with Date Format

    Hi All,
    I am facing an issue with the date format in the prompt. I have used date presentation variables in my column formula as shown below:
    FILTER("SKU Order Details"."Fulfilled Quantity" USING Time."Calendar Date" <= DATE '@{todate}{2900-01-01}')
    The report returns data when I don't select any date range for the start and end date prompts on the page. But when I select start and end date values in the prompt, I get the following error:
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 46047] Datetime value 10/22/2009 12:00:00 AM from 10/22/2009 12:00:00 AM does not match the specified format. (HY000)
    I included the following formulas for start & end date prompts:
    Start Date prompt: case when 1=2 then License."Ips Creation Date" else cast ('1.1.1900' as date) end
    End Date prompt: case when 1=2 then License."Ips Creation Date" else cast ('1.1.2900' as date) end
    Can you please help me resolve the issue?
    Thanks,
    Kartik

    Hi Nico,
    I tried the format that you mentioned, but I am still getting an error message.
    My prompts have the following formula :
    Start: case when 1=2 then License."Creation Date" else cast('1.1.1900' as DATE) end
    End: case when 1=2 then Time."Calendar Date" else cast('1.1.2900' as DATE) end
    My column formula has the following syntax:
    FILTER("SKU Order Details"."Fulfilled Quantity" USING Time."Calendar Date" between DATE '@{start}{1900-01-01}' AND DATE '@{end}{2999-01-01}')
    Error Message:
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 46047] Datetime value 11/17/2009 from 11/17/2009 does not match the specified format. (HY000)
    Can you please let me know if something needs to be changed?
    Thanks,
    Kartik
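
    For what it's worth, the DATE literal in logical SQL only accepts ISO yyyy-mm-dd values, which is why a value like 11/17/2009 coming from the prompt triggers the nQSError 46047 "does not match the specified format" error; the formula itself is fine once the variables arrive in that format. A minimal illustration of the same filter with hard-coded ISO dates (dates chosen arbitrarily):

    FILTER("SKU Order Details"."Fulfilled Quantity"
           USING Time."Calendar Date"
                 BETWEEN DATE '2009-11-17' AND DATE '2009-11-30')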

  • Issue with data source after deploying

    We are experiencing an issue with our data source after deployment of a cube. On the datasource properties in Visual Studio 2012, we have the max connections set to 0 before the deployment. Once the cube is deployed, I can navigate to the <name>.0.ds.xml
    file and open it and see that the <MaxActiveConnections>0</MaxActiveConnections> is indeed set to 0. At some point over the next couple days, a process of the cube or some other action causes that value to get updated to some number too large to
    be converted to an int, and makes the datasource invalid. At that point we cannot view the datasource properties in SSMS, we cannot open the cube project from Visual Studio, and we’ve even had failures when trying to process the cube.  Is there a config
    somewhere that would cause this value to get overwritten, or some other behind the scenes process that we can look at?
    Our server information is:
    Microsoft SQL Server 2012 (SP1) - 11.0.3153.0 (X64)
                    Jul 22 2014 15:26:36
                    Copyright (c) Microsoft Corporation
                    Enterprise Edition: Core-based Licensing (64-bit) on Windows NT 6.2 <X64> (Build 9200: ) (Hypervisor)
    Chad Dotzenrod SWC | TECHNOLOGY PARTNERS 1420 Kensington Road, Suite 110 Oak Brook, Illinois 60523-2144 http://www.swc.com

  • Installing the HCA Solution Accelerator - issue with SNC Deploy

    Hello Everyone -
    I am trying to install the HCA Solution Accelerator, and I am having an issue with the "ant deploy" of the Selection and Capture building block.
    I have configured the ~snc_1_0\install\ant.properties file, but I think I may have missed a setting somewhere because I am getting this build error:
    BUILD FAILED
    C:\Adobe\SolutionAccelerators\human-capital-application.1.0\sa\building_blocks\snc_1_0\install\build.xml:113: The following error occurred while executing this line:
    C:\Adobe\SolutionAccelerators\human-capital-application.1.0\sa\building_blocks\snc_1_0\deployer\build.xml:107: The following error occurred while executing this line:
    C:\Adobe\SolutionAccelerators\human-capital-application.1.0\sa\building_blocks\snc_1_0\deployer\build.xml:91: Java returned: 1
    Does anyone know off-hand what would cause this particular error?
    I am relatively confident in the path information that I have set for the j2eejar, junit.jar, lcsdk, lc.install.dir as well as the database driver and connection settings.
    I am using a simple turnkey LiveCycle install on a Windows XP OS.
    Any suggestions?
    Thank you in advance,
    Ben Lyons
    *I have attached the entire Build output from the command line as a text attachment in case anyone is interested

    Well, I found a "fix" to this issue - I am just going to post it here in case anyone else encouters a similar issue.
    Apparently, it relates to some sort of error in build.xml file in the ~\snc_1_0\deployer folder.
    The particular line that was causing the build to fail was line 91 of this file, which looks like this:
    <target name="data-purge">
    Line 91 -->  <java classname="com.adobe.solutions.snc.utils.DatabaseExporter"
                       failonerror="false" fork="true"
                       classpathref="hibernate.setup.path">
                     <jvmarg value="-Dhibernate.bytecode.provider=javassist"/>
                     <arg line="create-drop ${dist.dir}/hibernate-test.cfg.xml ${database.driver.class} ${database.connection.url} ${database.username} ${database.password}"/>
                 </java>
    </target>
    The "failonerror" attribute is by default set to "true", thus an error was occurring, so the build was set to fail.  I changed this to false so that I could build this building block, but doing so has filled me with trepidation that something isn't going to be writing to my database correctly.
    Does anyone know what a failure in this segment of this build.xml file means as far as the built SNC building block?  (Remember this is the build.xml file in the deployer folder and not the install folder).
    In other words - does this mean I should be expecting some kind of lingering error in this building block - and if so, what would that issue most likely be so that I can keep an eye out for it?
    -Ben

  • Issues with data usage Samsung S5

    Over the past few months I have been paying outrageous bills because one of my lines eats up all my data even when no one is touching the phone. I have 3 phones and a tablet on my plan and I am paying $288.00 a month, and last month they charged me $15.00 for overage because it went over on the last day of the billing cycle. Once again that one line ate up all the data even though it is connected to our wifi or just sitting dormant. I am getting sick of paying these outrageous prices and am about to go somewhere else.
    I was tricked into the $10.00 a month tablet fee when I signed two phones up under the Verizon Edge plan. The sales clerk did not tell me I could not cancel the $10.00 a month fee because it would be under a one-year contract, nor did I sign anything saying that, but when I went back in to change the 3rd line over to the Edge plan they told me it would be a $300 or so disconnect charge. So of course I am still paying $10.00 a month for no reason, because I have wifi in my home and I only use the tablet in my home.
    So I need help figuring out the data issue with just one of my lines. I do not have bluetooth or location turned on. I have gone to the Play Store and uninstalled everything I could. I constantly close out of my applications and clear history, etc.

    So you thought they would just give you a free tablet at $10 a month and it didn't need to be in a contract like the phones? I took the free tablet too, but common sense told me it was locked into a contract. I didn't even ask, as I just knew that would be the case, but I didn't care. Mary

  • Performance issues with data warehouse loads

    We have performance issues with our data warehouse ETL load process. I have run ANALYZE and DBMS_STATS and checked the database environment. What else can I do to optimize performance? I cannot use Statspack since we are running Oracle 8i. Thanks
    Scott

    Hi,
    You should analyze the database after you have loaded the tables.
    Do you use sequences to generate PKs? Do you have a lot of indexes and/or triggers on the tables?
    If yes:
    Make sure your sequences cache values (ALTER SEQUENCE s CACHE 10000).
    Drop all unneeded indexes while loading and disable triggers if possible.
    How big is your redo log buffer? When loading a large amount of data it may be worth enlarging this buffer.
    Do you have more than one DBWR process? Writing in parallel can speed things up when a checkpoint is needed.
    Is it possible to use a direct-path load, or do you already do that?
    Dim
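
    A rough Oracle SQL sketch of those suggestions, using made-up object names (orders, orders_staging, orders_seq, orders_status_idx) purely for illustration:

    -- Cache sequence values so PK generation does not hit the data dictionary per row
    ALTER SEQUENCE orders_seq CACHE 10000;

    -- Drop or disable unneeded indexes and triggers for the duration of the load
    DROP INDEX orders_status_idx;
    ALTER TABLE orders DISABLE ALL TRIGGERS;

    -- Direct-path insert writes above the high-water mark and generates far less undo
    INSERT /*+ APPEND */ INTO orders
    SELECT * FROM orders_staging;
    COMMIT;

    -- Rebuild the index, re-enable triggers, then gather statistics on the loaded table
    CREATE INDEX orders_status_idx ON orders (status);
    ALTER TABLE orders ENABLE ALL TRIGGERS;
    ANALYZE TABLE orders COMPUTE STATISTICS;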

  • Issue with Date showing Null in interactive report

    I created an interactive report for a customer and was confused to see blanks, or more specifically dashes, where there should be dates in one of the fields. I knew this field should have data, so I did some testing, and this is what I found:
    The sql I am running is:
    select
    assigned_to_company,
    last_resolved_date,
    incident_id
    from
    rhpd0009_im_adherence_rpt2_vw
    When I run the command in SQL workshop I get the following results with data in the last_resolved_date field:
    [http://i83.photobucket.com/albums/j299/yogibayer/apexdateissuesqlcommand.jpg]
    I copied and pasted the SQL from SQL workshop and created a new interactive report and got the following results with no last_resolved_dates showing up:
    [http://i83.photobucket.com/albums/j299/yogibayer/apexdateissueinteractivereport.jpg]
    For some reason the order is different, but the first one INC1117629 shows up in both of them and has a last_resolved_date in SQL workshop, but not in the interactive report. Any help would be appreciated.
    Thank You
    Scott

    Varad,
    It seems to be related to the function we use to convert Remedy dates to Viewable dates. Remedy dates are stored as an integer that represents the absolute number of seconds from Jan 1, 1970 at 12:00 AM. We use a function that converts this number into a human readable date. I have tried encapsulating the result of the function in a TO_DATE and a TO_CHAR with the same results as before. There is something about the resulting data from the date convert function that Apex doesn't like. It would be interesting to isolate what exactly the issue is, but right now I'm just trying to find a work around.
    Thank You
    Scott
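
    As a sketch of one possible work-around (not the poster's actual fix): since Remedy stores the timestamp as whole seconds since January 1, 1970, the report query could hand APEX a plain Oracle DATE instead of whatever the custom conversion function returns. The epoch column name below (last_resolved_seconds) is hypothetical:

    select assigned_to_company,
           incident_id,
           -- convert epoch seconds to an Oracle DATE (86400 seconds per day)
           date '1970-01-01' + (last_resolved_seconds / 86400) as last_resolved_date
      from rhpd0009_im_adherence_rpt2_vw;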

  • Issue with Date and Time Picker on Windows 8.1 and Office 2013

    We are having an issue with the Developer tool in Excel called the Date and Time Picker, which puts a calendar field into the spreadsheet where you can select a date from a calendar popup.
    If we change this date field and then save, close and reopen the file, it reverts back to the previous date from before the change was made.
    If we do the same process on a Windows 7 machine with Office 2013, it saves the changes we made to the Date and Time Picker field.

    Hi jdono2,
    I'm using Windows 8 and Office 2013, and I can't reproduce this issue.
    Please make sure you have upgraded to the latest version of Excel, and try re-registering MSCOMCT2.OCX.
    You can also try another date picker add-in to test this issue.
    http://social.msdn.microsoft.com/Forums/office/en-US/36f83f24-cd76-4f8e-aa7b-5f166666e7d3/excel-2013-popup-calendar?forum=exceldev
    Wind Zhang
    TechNet Community Support

  • Issue with Date format - ABAP to XML

    Dear Users,
    We are currently facing an issue with the date formats in XML.
    We have a .NET system with a web service that we call for information from SAP. We created a proxy class in SAP from the WSDL file and have attempted to use the method that gets us the required information based on the timestamp passed from SAP. However, the timestamp field in the INPUT structure uses the data element XSDDATETIME_Z.
    All we can send from SAP is a simple TIMESTAMP, but the .NET system doesn't accept it, since it wants the timestamp in XML format, i.e. <yyyy-mm-dd>T<hh:mm:ss>Z. The SAP documentation says that the field should automatically be converted from ABAP to XML format, but that doesn't happen. We don't want to build a string from the timestamp in the XML format ourselves, since we would surely miss some of the cases involved.
    Can anyone please suggest a way for us to send the date out in the required XML format?
    Many thanks!

    Hi Vijay,
    Look at the sample code below; it works fine. I guess there is something wrong in your code or conversion - post the actual code if you are still not able to figure it out with the example below.
    DATA: l_xml_string TYPE string,
          l_dat_time   TYPE xsddatetime_z.

    CALL FUNCTION 'CACS_DATE_GET_TIMESTAMP'
      EXPORTING
        i_date                   = sy-datum
        i_time                   = sy-uzeit
      IMPORTING
        e_timestamp              = l_dat_time
      EXCEPTIONS
        date_not_filled_but_time = 1
        date_has_no_valid_format = 2
        OTHERS                   = 3.
    IF sy-subrc <> 0.
    * MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    *   WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.

    CALL TRANSFORMATION id
      SOURCE root = l_dat_time
      RESULT XML l_xml_string.
    IF sy-subrc EQ 0.
      WRITE: l_xml_string.
    ENDIF.
    Regards,
    Chen

  • Issue with Data Load Table

    Hi All,
    I am facing an issue with APEX 4.2.4 using the Data Load Table feature. In the lookup I used the Where Clause option, but the where clause does not seem to be working. Please help me with this.

    Hi all,
    It looks like the where clause does not filter out the 'N' data. Please help me figure out how to solve this.

  • Issue with Date Format for Presentation Variables

    Hi,
    I am using dashboard prompts to capture a begin date and an end date in presentation variables. The dates selected from the calendar are in the format mm/dd/yyyy.
    In Answers I need to get a count of days between the begin and end dates. I am using the column formula shown below:
    TIMESTAMPDIFF(SQL_TSI_DAY, DATE '@{pBeginDate}', DATE '@{pEndDate}')
    Whenever I run the report from the dashboard I get the following error:
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 46046] Datetime value 1/1/2005 does not match the specified format. (HY000)
    SQL Issued: {call NQSGetQueryColumnInfo('SELECT "Transaction Dates"."Transaction Date", TIMESTAMPDIFF(SQL_TSI_DAY, DATE ''1/1/2005'', DATE ''1/2/2006'' FROM "Dates"')}
    SQL Issued: SELECT "Transaction Dates"."Transaction Date", TIMESTAMPDIFF(SQL_TSI_DAY, DATE '1/1/2005', DATE '1/2/2006') FROM "Dates"
    Can anyone help me resolve this date format issue?
    Thanks,
    Aravind

    Hi,
    see the link below:
    Issues with Prompts calender date
    Regards
    Naresh
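
    For reference, a DATE literal in logical SQL only accepts ISO yyyy-mm-dd values, which is why DATE '1/1/2005' in the generated SQL above raises the nQSError 46046 "does not match the specified format" error; once the prompts pass the presentation variables in that format, the original formula works. A minimal illustration with hard-coded ISO dates (values chosen arbitrarily):

    TIMESTAMPDIFF(SQL_TSI_DAY, DATE '2005-01-01', DATE '2006-01-02')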

  • Issue with Data Load to InfoCube with Nav Attrivutes Turned on in it

    Hi,
    I am having an issue with loading data to an InfoCube. When I turn on the navigational attributes in the cube, the data load fails and just says "PROCESSED WITH ERRORS". When I turn them off, the data load goes fine. I have run an RSRV test on both the InfoObject and the cube, and it shows no errors. What could be the issue and how do I solve it?
    Thanks
    Rashmi.

    Hi,
    To activate a navigation attribute in the cube, the data does not need to be dropped from the cube.
    You can always activate the navigation attribute with data still in the cube.
    I think you may have tried to activate it in the master data as well and then in the cube, or something like that.
    Follow the correct procedure and try again.
    Thanks
    Ajeet

  • An issue with Data rules

    Hi
    I'm using 3 data rules, viz. Is_number, is_foreign_key and Is_not_null.
    I have applied these rules to around 30 columns. When I deploy this mapping it takes a very long time, as long as 48 hours. When I deploy the same mapping without the data rules, it deploys in less than a minute.
    I looked at the generated code; it is around 18k lines.
    I do not know why it is taking such a long time to deploy. I'm afraid of what will happen when I test the mapping with data.
    Please suggest how I can improve the performance.
    Regards
    Vibhuti

    Hi Venkat,
    If you do a selective deletion for the month of August, the data will be removed from the active table only, so from the next delta load the data will be updated from the change log table.
    First, identify what the incorrect data is. Then create a separate InfoPackage and perform a repair full request for the month of August only. Nothing to worry about - your deltas won't be affected.
    Is your data being loaded from the PSA and then into the data targets?
    Do you know how to perform a repair full request?
    Assign points if it helps.
    Urs,
    BI
