OWB 9.0.3.37 and imported plsql packages

I have some problems with imported packages in OWB version 9.0.3.37.
I can't delete a procedure/function from the imported package; in the package body I can do this.
The only way to delete a procedure/function is to delete the package and re-import it from the database into OWB.
The problem then is that when I validate a mapping I must reconcile inbound the transformation (the link is broken).
Is there another way to delete a function/procedure from an imported package?
Is importing packages still possible in the new version 9.2?

Hi,
There is an internal reason (validation) why we cannot allow you to delete functions from a package that you imported. That model is still the same in the latest production release.
Thanks,
Mark.
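
If you do go the delete-and-re-import route, a hedged aside that is not in the thread: it can help to check in the data dictionary exactly which functions/procedures the re-import will pick up, so you know what you will have to reconcile afterwards. The package name below is a placeholder.

-- List the callable units of a package as the database (9i and later) sees them.
-- MY_PACKAGE is a hypothetical name; replace it with your own package.
SELECT object_name, procedure_name
  FROM all_procedures
 WHERE object_name = 'MY_PACKAGE'
 ORDER BY procedure_name;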

Similar Messages

  • Using ant to build and deploy plsql packages to a 11g wls.

    Hi all,
    Not sure if this is the right place, but I found a few other threads in this forum regarding plsql and ant, so I'll try this one.
    My firm is moving from 10g ADF applications to 11g SOA applications. When I'm not busy being lost in the whole new SOA world, I also have the pleasure of building our ant scripts. :-)
    It has been years since I looked at the deploy scripts I made for 10g, so I'm a bit rusty.
    I used wsassemble in our old scripts and I used http://download.oracle.com/docs/cd/B25221_05/web.1013/b14434/wsassemble.htm as a reference. But I can't seem to find the same documentation for 11g. I tried http://download.oracle.com/docs/cd/E12839_01/index.htm but it doesn't seem to have anything about plsqlAssemble, genInterface, genproxy etc. for 11g WLS servers. I also tried Googling for a while, to no avail.
    For example, I'm having problems setting the values for the ant-oracle.properties.
    So can someone please throw some links my way, so I can read how to build and deploy plsql packages to an 11g WLS server using ant? Thanks!
    Cheers,
    William

    Hi,
    Did you ever find an answer to this question ?
    I am currently creating the webservice using webservices assembler from 10g/OC4j, then using the SmartUpgrade tool to re-engineer the webservice for deployment to 11g
    ( SmartUpgrade info here http://docs.oracle.com/cd/E16764_01/doc.1111/e15878/intro.htm)
    G
    Edited by: user999097 on Mar 5, 2012 1:16 PM

  • BPC 7 SP3 and Import Excel package

    Hi,
    I use BPC 7 SP3 (with SQL Server 2005) and I have some problems with the standard package named "Import Excel". The following error is returned:
    Package Error Events: 
    ErrorCode = -1073450901
    Source = ExcelToText
    SubComponent= DTS.Pipeline
    Description = "component "EXCELFILE" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
    Here are the parameters:
    %XLFILE%     /ECOS/COMPTABILITE/DataManager/DataFiles//EXPORT_COMPTABILITE.XLS
    %SHEET%     COMPTABILITE_CIBLE
    %COLUMNS%     
    %TRANSFORMATION%     /ECOS/COMPTABILITE/DataManager/TransformationFiles//Transf_JDE_Compta_Actual.xls
    %CLEARDATA%     1
    %RUNLOGIC%     0
    %CHECKLCK%     0
    Any ideas?
    thanks,
    Romuald

    Hi,
    here is more information. The Excel file I am trying to load has different columns from the standard ones (I have a HYP dimension and no INTCO dimension). I have specified the list of columns in the package parameters interface, but it seems that these values are not used by the package. I think the package expects a file with exactly the standard columns, no more, no less. It works fine with BPC 4.2 (I use the standard Import Excel package and the file loads correctly).
    thanks for your help,
    Romuald

  • Import anonymous package ?

    Hi
    I'm new here!
    How do I import classes from an anonymous package (i.e. one without a "package packageName;" declaration)?
    I know where the classes are, and the classpath is set to this dir. The packages into which I want to import these classes are in sub-dirs of this dir.
    thank you
    jz
    (If you don't understand me please don't wonder 'cause i'm 13 and speaking german :) )

    How to import an anonymous package(without package
    packageName;)?
    i know where the classes are and the classpath is set
    to this dir. The packages which i want to import this
    classes are in sub-dirs of this dir.
    (If you don't understand me please don't wonder
    'cause i'm 13 and speaking german :) )
    Don't worry, you're doing more than fine. The technical stuff: you can't
    import classes from the default package and you shouldn't want to do
    that. Put all of your classes in the appropriate packages and import
    those packages (or single classes from those packages).
    If you insist on having a class in the default package, make it the single
    class with the 'main' method that starts up the entire thing.
    kind regards,
    Jos

  • Import PLSQL procedure or package into OWB

    Here's a simple question. In schema X, I created a procedure and a package. If I try to import them into OWB (10g R2), connecting to schema X and clicking on transformations and "import" doesn't show the newly created objects for import. I tried granting execute to OWB10GR2, but no luck. Closed and re-started design center with no luck.
    Creating a procedure in Design Center and then deploying it to schema X works and is a workaround, but it's awkward to write code there (compile is deploy, which takes a lot longer than normal compile).
    What am I missing so that I can't import a procedure into design center?

    OK, so I'm going to answer my own question. Maybe someone else can use the info...
    The problem is that I have development and production configurations.
    A procedure or package defined in only one (dev or prod) schema won't show up on the import list. Apparently, in order to import an object into OWB you need to define it in all configured environments; otherwise it doesn't appear on the import list.
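    A hedged aside that is not in the original answer: before attempting the import, it can be worth confirming from the data dictionary that the object really is visible and valid in every schema the configurations point at. The object names below are placeholders.

    -- Run as a user who can see both schemas (e.g. the OWB repository owner).
    -- If an object is missing or INVALID in one schema, it will not show up
    -- on the OWB import list.
    SELECT owner, object_name, object_type, status
      FROM all_objects
     WHERE object_name IN ('MY_PROC', 'MY_PACKAGE')   -- hypothetical names
       AND object_type IN ('PROCEDURE', 'PACKAGE');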

  • Scripting oracle export and import dumps through PLSQL stored procedures

    Hello,
    I would like to know if it is possible to script Oracle export and import dump commands in a PL/SQL package rather than at the command prompt. Also, how can I copy the export dump files across the network to a specific location?
    I would really appreciate it if someone could provide me with examples.
    Or is there an off-the-shelf solution for what I am trying to achieve?
    Many Thanks.

    Hello,
    there are many ways to do this:
    - Java with PL/SQL wrapper,
    - call C code as external procedure from PL/SQL,
    - DBMS_SCHEDULER has some features related to this as well,
    - or write your own logic: for example creating a new file with UTL_FILE could be the trigger of the export.
    Franky
    Edited by: Franky on Aug 10, 2009 4:25 AM - extended
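    To make the DBMS_SCHEDULER suggestion above a little more concrete, here is a minimal hedged sketch: a one-off external job that calls a shell wrapper around expdp, which in turn writes the dump file and copies it across the network. The job name and script path are hypothetical, and an external job also needs the scheduler's OS-level job agent/credentials set up.

    BEGIN
      -- Hypothetical external job; /u01/app/oracle/scripts/run_expdp.sh is a
      -- placeholder wrapper script that runs expdp and then copies (e.g. scp)
      -- the resulting dump file to the target host.
      DBMS_SCHEDULER.CREATE_JOB(
        job_name   => 'RUN_EXPORT_JOB',
        job_type   => 'EXECUTABLE',
        job_action => '/u01/app/oracle/scripts/run_expdp.sh',
        enabled    => TRUE,
        auto_drop  => TRUE,
        comments   => 'Export dump triggered from PL/SQL');
    END;
    /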

  • Issue in OWB mapping - when changing source and target database

    Hi,
    I need help resolving an issue I am facing when moving mappings from the development environment to QA.
    Here is situation,
    We develop ETL using one source, one staging and one target database.
    In development we use one control_center for Source to staging and another control center staging to target.
    All works fine in development.
    Now I have created a new runtime repository and imported all OWB projects (with full dependencies, an exact replica of development). Now I need to point the source, staging and target at different databases.
    I have created new database location connections and defined/attached DB connectors with stage and target location.
    Now issues are
    1.     Two staging mappings cannot be bound to their source tables (giving different errors):
    a.     One mapping shows the error "source synonym translation no longer valid" when deploying, but validation completes without any issue.
    b.     The other mapping shows the error "source table/object not bound to repository".
    2.     All the target mappings validate successfully, but deployment says "table or view does not exist". The tables do exist on source, stage and target (and permissions are set correctly for the target user to read from the staging tables).
    Not sure how to proceed from here.
    I have recreated a new repository, re-imported all projects/mappings and defined all connections, but I still get the same issue.
    Thanks in advance,
    Vipin

    1. Two staging mappings cannot be bound to their source tables (giving different errors):
    a. One mapping shows the error "source synonym translation no longer valid" when deploying, but validation completes without any issue.
    b. The other mapping shows the error "source table/object not bound to repository".
    The above errors were resolved when I re-synchronized the tables and mappings (for a few I had to re-import the table).
    2. All the target mappings validate successfully, but deployment says "table or view does not exist". The tables do exist on source, stage and target (and permissions are set correctly for the target user to read from the staging tables).
    The above error is still pending. My target mappings cannot be deployed/compiled.
    For the above I have defined one staging location per target location, and the target location has a connector to staging (not sure if I have to give the connector the same name as the staging location; I created the DB connector with a different name, but the referenced database is the same as the staging location).
    Mappings are associated with the desired data location and metadata.
    The control center also has that data location.
    Mappings are configured for the desired location.
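    A hedged sanity check that is not in the thread (the schema name below is a placeholder): before blaming OWB metadata, verify from the target schema that the staging objects really are visible at deployment time. If these queries return no rows, "table or view does not exist" is a grants/synonym problem rather than a repository problem.

    -- Run connected as the target user.
    SELECT owner, table_name
      FROM all_tables
     WHERE owner = 'STAGE_OWNER';          -- hypothetical staging schema

    SELECT table_name, privilege, grantee
      FROM all_tab_privs
     WHERE table_schema = 'STAGE_OWNER';   -- hypothetical staging schema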

  • OWB Can't Import Existing Package with Collection Data Type as Arguments

    I created a package and compiled it successfully, as follows:
    +++++++++++++++++++++++++++++
    CREATE OR REPLACE PACKAGE PACKAGE_TEST AS
      TYPE NUM_LIST IS TABLE OF NUMBER INDEX BY BINARY_INTEGER;
      FUNCTION TEST_FUNCTION1 (arg1 NUMBER, arg2 NUMBER) RETURN NUMBER;
      FUNCTION TEST_FUNCTION2 (args NUM_LIST) RETURN NUMBER;
    END PACKAGE_TEST;

    CREATE OR REPLACE PACKAGE BODY PACKAGE_TEST AS
      FUNCTION TEST_FUNCTION1 (arg1 NUMBER, arg2 NUMBER) RETURN NUMBER IS
        ln_sum NUMBER;
      BEGIN
        ln_sum := arg1 + arg2;
        RETURN ln_sum;
      EXCEPTION
        WHEN OTHERS THEN
          RETURN -10000;
      END TEST_FUNCTION1;

      FUNCTION TEST_FUNCTION2 (args NUM_LIST) RETURN NUMBER IS
        ln_sum NUMBER;
      BEGIN
        ln_sum := 0;
        FOR i IN 1 .. args.COUNT LOOP
          ln_sum := ln_sum + args(i);
        END LOOP;
        RETURN ln_sum;
      EXCEPTION
        WHEN OTHERS THEN
          RETURN -10000;
      END TEST_FUNCTION2;
    END PACKAGE_TEST;
    ++++++++++++++++++++++++++++++++++++++++
    Then I tried to import the package into OWB (9.2). The problem came: all the package body and
    the specification of TEST_FUNCTION1 were imported, but the package specification of
    TEST_FUNCTION2 couldn't be imported. The IMPORT FILTER gave an error message:
    "TEST_FUNCTION2: Argument Data type is not supported".
    How can I resolve this problem? It is very important for our project, since we have to import some
    packages with user-defined collection data types as arguments.
    Thank you.
    Lushu

    Hi Lushu,
    Unfortunately this is indeed not supported and I would not know how to work around this one. The only way I guess that this would work is to not import it, and then use Expressions to call the function with inputs.
    Jean-Pierre
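    A hedged workaround that goes a bit beyond Jean-Pierre's suggestion (the names below are illustrative, and whether it fits your design is another question): expose a wrapper with only scalar arguments, so that OWB sees a supported signature, and build the PL/SQL table inside the wrapper before delegating to the real function.

    -- Hypothetical wrapper: OWB can import this scalar signature, while the
    -- collection-based TEST_FUNCTION2 stays a database-only detail.
    CREATE OR REPLACE FUNCTION TEST_FUNCTION2_WRAP (
      arg1 NUMBER,
      arg2 NUMBER,
      arg3 NUMBER
    ) RETURN NUMBER IS
      l_args PACKAGE_TEST.NUM_LIST;
    BEGIN
      l_args(1) := arg1;
      l_args(2) := arg2;
      l_args(3) := arg3;
      RETURN PACKAGE_TEST.TEST_FUNCTION2(l_args);
    END TEST_FUNCTION2_WRAP;
    /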

  • Export and Import of mappings/process flows etc

    Hi,
    We have a single repository with multiple projects for DEV/UAT and PROD of the same logical project. This is a nightmare for controlling releases to PROD and in fact we have a corrupt repository as a result I suspect. I plan to split the repository into 3 separate databases so that we have a design repos for DEV/UAT and PROD. Controlling code migrations between these I plan to use the metadata export and subsequent import into UAT and then PROD once tested. I have used this successfully before on a project but am worried about inherent bugs with metadata export/imports (been bitten before with Oracle Portal). So can anyone advise what pitfalls there may be with this approach, and in particular if anyone has experienced loss of metadata between export and import. We have a complex warehouse with hundreds of mappings, process flows, sqlldr flatfile loads etc. I have experienced process flow imports that seem to lose their links to the mappings they encapsulate.
    Thanks for any comments,
    Brandon

    This should do the trick for you: it looks for "PARALLEL", so it only removes the APPEND PARALLEL hint and leaves other hints as is.
    # Set current location
    set path "C:/TMP"
    # Project parameters
    set root "/MY_PROJECT"
    set one_Module "MY_MODULE"
    set object "MAPPINGS"
    # OMB*Plus and Tcl related parameters
    set action "remove_parallel"
    set datetime [clock format [clock seconds] -format %Y%m%d_%H%M%S]
    set timestamp [clock format [clock seconds] -format %Y%m%d-%H:%M:%S]
    set ext ".log"
    set sep "_"
    set ombplus "OMBplus"
    set omblogname $path/$one_Module$sep$object$sep$datetime$sep$ombplus$ext
    set OMBLOG $omblogname
    set logname $path/$one_Module$sep$object$sep$datetime$ext
    set log_file [open $logname w]
    set word "PARALLEL"
    set i 0
    # Connect to the OWB repository
    OMBCONNECT .... your connect string
    # Ignore errors that occur in any command that is part of the script and move on to the next command.
    set OMBCONTINUE_ON_ERROR ON
    OMBCC "'$root/$one_Module'"
    # Search mappings for PARALLEL in the loading/extraction hints
    puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Searching for Loading/Extraction Operators set at Parallel"
    puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
    puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Searching for Loading/Extraction Operators set at Parallel"
    puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
    foreach mapName [OMBLIST MAPPINGS] {
        # TABLE, DIMENSION, CUBE and VIEW operators all carry loading/extraction hints
        foreach opType {TABLE DIMENSION CUBE VIEW} {
            foreach opName [OMBRETRIEVE MAPPING '$mapName' GET $opType OPERATORS] {
                # Clear the loading hint if it contains PARALLEL
                foreach prop1 [OMBRETRIEVE MAPPING '$mapName' OPERATOR '$opName' GET PROPERTIES (LOADING_HINT)] {
                    if { [regexp $word $prop1] == 1 } {
                        incr i
                        puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Loading Operator: $opName, Property: $prop1"
                        puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Loading Operator: $opName, Property: $prop1"
                        OMBALTER MAPPING '$mapName' MODIFY OPERATOR '$opName' SET PROPERTIES (LOADING_HINT) VALUES ('')
                        OMBCOMMIT
                    }
                }
                # Clear the extraction hint if it contains PARALLEL
                foreach prop2 [OMBRETRIEVE MAPPING '$mapName' OPERATOR '$opName' GET PROPERTIES (EXTRACTION_HINT)] {
                    if { [regexp $word $prop2] == 1 } {
                        incr i
                        puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Extraction Operator: $opName, Property: $prop2"
                        puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Extraction Operator: $opName, Property: $prop2"
                        OMBALTER MAPPING '$mapName' MODIFY OPERATOR '$opName' SET PROPERTIES (EXTRACTION_HINT) VALUES ('')
                        OMBCOMMIT
                    }
                }
            }
        }
    }
    if { $i == 0 } {
        puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
        puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Not found any Loading/Extraction Operators set at Parallel"
        puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
        puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Not found any Loading/Extraction Operators set at Parallel"
    } else {
        puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
        puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Fixed $i Loading/Extraction Operators set at Parallel"
        puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
        puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Fixed $i Loading/Extraction Operators set at Parallel"
    }
    close $log_file
    Enjoy!
    Michel

  • Importing Database Package into OWB changes position of parameters .. Nasty

    Hi all,
    Anyone else having the same problem ? Any solutions ? Or is this a bug.
    When a package is imported from a DB schema, OWB kindly changes the order in which the parameters are listed.
    The effect is that any mapping using the procedure or function will not have the correct parameters passed to it.
    For example.
    DB_Test_proc
    io_name_in in out varchar2,
    o_clean_name_out out varchar2
    After importing in OWB the parameter positions are reversed .... very nasty.
    Can anyone help, or do I need to raise a bug?
    Regards
    Nick

    Hi Borkur,
    Dropping and re-importing the package would work well (works for another module) but this is a package whose procedures and functions are heavily referenced in several mappings and process flows (supports the custom job management). So, I would be forced to re-synchronize them all which I want to avoid. The other way would be to modify the package manually (only the signature is needed not the code since I will not deploy it from OWB, in fact, I deployed it from Oracle Designer !!!). But OWB Paris doesn't allow this. Once imported, always imported. You can modify imported tables, procedures, functions but not packages. Or is there a way ???
    regards
    Thomas
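    For what it is worth (a hedged aside, not from the thread): the parameter order the database itself reports can be compared with what OWB shows after the import. The procedure name below is a placeholder.

    -- POSITION is the authoritative argument order as stored in the data dictionary.
    SELECT package_name, object_name, argument_name, position, in_out, data_type
      FROM all_arguments
     WHERE object_name = 'DB_TEST_PROC'     -- hypothetical procedure name
     ORDER BY position;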

  • Importing a package into OWB repository.

    Hi,
    there is a package existing in my OWB repository.
    Package ABC
    Procedure PQR(pram1 varchar2,param2 varchar2)
    Now I have modified this package in my target schema; an extra parameter has been added to the procedure PQR:
    Procedure PQR(pram1 varchar2,param2 varchar2,param3 number).
    Now when I re-import this package into my OWB repository module, it shows both versions of the procedure:
    Package ABC
    PQR(pram1 varchar2,param2 varchar2)
    PQR(pram1 varchar2,param2 varchar2,param3 number)
    PQR(pram1 varchar2,param2 varchar2) is still visible in the OWB repository even though it has been replaced with PQR(pram1 varchar2,param2 varchar2,param3 number) in the target schema.
    Is there any way I can replace the old version of the procedure with the new version in the OWB repository?
    Regards
    RD_RBS

    Hi
    Standard approach to be followed for any object modification
    1. If you have created your package from OWB and deployed it into your DB, always do it that way, i.e. Design Center --> DB. Do not make any modifications at the DB level and then import them back into your Design Center. OWB manages IDs for all objects in the repository and this will confuse it. That is why you see two versions of the same package. Any modifications should be done at the Design Center level; then do a replace on the package from the Control Center.
    2. If an object (package or procedure) is created on the DB directly and is imported into the Design Center, follow the same norm every time you make any changes to that object. The status of that object will appear as "New" or "Not deployed" in the Control Center, but that is fine because OWB has no ID in its repository to maintain this information.
    3. If you need to rename any object already deployed from OWB, always drop it first, then rename it and then deploy it. That way OWB will maintain the name and ID on its end and not get confused.
    4. For your case, you can drop the older version of the package from OWB, delete the one imported from the database into OWB, make the changes at the OWB level (adding the parameter etc.) and then redeploy the same package as a replace from the Control Center.
    Hope these tips help
    birdy

  • How do I save and import my bookmarks from another hard drive? When I try to open the installed Firefox on the old drive, it (obviously) opens a browser from the new main drive, free of bookmarks. Is there a way I can save the bookmarks on the old drive w

    How do I save and import my bookmarks from another hard drive? When I try to open the installed Firefox on the old drive, it (obviously) opens a browser from the new main drive, free of bookmarks. Is there a way I can save the bookmarks on the old drive without opening a browser?
    The guts of my computer were rearranged and I got a new main hard drive. My old one is still in there and I can get stuff from it, but when I go to the Mozilla folder on the old one, I can't figure out if there's anything I can do to get all my bookmarks from that drive to my new one, where Firefox is newly installed.

    If you open Firefox then Firefox will always use the default profile folder as found via profiles.ini on your system drive.
    You either need to import the file in your current default profile or copy the file to your current profile folder while Firefox is closed.
    Firefox 3 stores the bookmarks and the browser history in [http://kb.mozillazine.org/places.sqlite places.sqlite] and no longer creates an HTML backup by default.
    There are also (five) JSON backups in the bookmarkbackups folder within the Firefox profile folder.
    You can either copy the file places.sqlite to your [http://kb.mozillazine.org/Profile_folder_-_Firefox Firefox Profile Folder] or import the most recent JSON backup from the bookmarkbackups folder of that old profile.
    See:
    http://kb.mozillazine.org/Backing_up_and_restoring_bookmarks_-_Firefox
    http://kb.mozillazine.org/Transferring_data_to_a_new_profile_-_Firefox
    See http://kb.mozillazine.org/Profile_folder_-_Firefox
    "Application Data" in XP/Win2K and "AppData" in Vista/Windows 7 are hidden folders.
    See http://kb.mozillazine.org/Show_hidden_files_and_folders
    Go to: Control Panel > Folder Options > "View" tab > under "Hidden files and folders", select "Show hidden files and folders".
    You may want to un-check the box "Hide extensions for known file types" to see the file extensions of all files.

  • How to update link and import data of relocated incx file into inca file?

    Subject: how to update link and import data of a relocated incx file into an inca file?
    The incx file was originally part of the inca file and it has been relocated.
    -------------------
    Hello All,
    I am working on InDesign CS2 and InCopy CS2.
    From InDesign I am creating an assignment file as well as InCopy files (.inca and .incx files created through exporting).
    Now InDesign hardcodes the path of the incx files in the inca file. So if I put the incx files in a different folder, then after opening the inca file in InCopy I get an alert stating that "The document doesn't consist of any incopy story" and all the linked stories flag a red question mark icon.
    So I tried to recreate and update the links.
    Below is my code for that:
    //code start*****************************
    //creating kDataLinkHelperBoss
    InterfacePtr<IDataLinkHelper> dataLinkHelper(static_cast<IDataLinkHelper*>
    (CreateObject2<IDataLinkHelper>(kDataLinkHelperBoss)));
    /**
    The newFileToBeLinkedPath is the path of the incx file which is relocated.
    And it was previously part of the inca file.
    e.g. earlier it was c:\\test.incx, now it is d:\\test.incx
    */
    IDFile newIDFileToBeLinked(newFileToBeLinkedPath);
    //create the datalink
    IDataLink * dlk = dataLinkHelper->CreateDataLink(newIDFileToBeLinked);
    NameInfo name;
    PMString type;
    uint32 fileType;
    dlk->GetNameInfo(&name,&type,&fileType);
    //relink the story
    InterfacePtr<ICommand> relinkCmd(CmdUtils::CreateCommand(kRestoreLinkCmdBoss));
    InterfacePtr<IRestoreLinkCmdData> relinkCmdData(relinkCmd, IID_IRESTORELINKCMDDATA);
    relinkCmdData->Set(database, dataLinkUID, &name, &type, fileType, IDataLink::kLinkNormal);
    ErrorCode err = CmdUtils::ProcessCommand(relinkCmd);
    //Update the link now
    InterfacePtr<IUpdateLink> updateLink(dataLinkHelper, UseDefaultIID());
    UID newLinkUID;
    err = updateLink->DoUpdateLink(dl, &newLinkUID, kFullUI);
    //code end*********************
    I am able to create the proper link, but the data which is in the incx file is not getting imported into the linked story. However, if I modify the newly linked story from the inca file, the incx file does get updated (all its previous content will be deleted).
    I tried using Utils<IInCopyWorkflow>()->ImportStory(), but it imports the incx file in xml format.
    What is the solution for this, then?
    Kindly help me as I have been terribly stuck for the last few days.
    Thanks and Regards,
    Yopangjo

    > I can say that anybody with no experience could easily do an export/import in MSSQLServer 2000.
    Anybody with no experience should not mess up my Oracle databases!

  • Difference between empty plsql record and null plsql record

    Hi there,
    I am kind of confused between an empty PL/SQL record and a null PL/SQL record.
    How do I assign a PL/SQL record to be empty, and to be null?
    create type emp_obj as object (enum number, ename varchar2);
    CREATE OR REPLACE TYPE emp_type AS TABLE OF emp_obj;
    Thanks

    First of all, do not use the term "PL/SQL record" in this context. A record type in PL/SQL is completely different from an object type. Secondly, there are two states of a nested table:
    1. Uninitialized:
    SQL> create or replace
      2    type emp_obj_type as object(enum number, ename varchar2(10));
      3  /
    Type created.
    SQL> create or replace
      2    type emp_tbl_type as table of emp_obj_type
      3  /
    Type created.
    SQL> declare
      2      v_emp_tbl emp_tbl_type;
      3  begin
      4      v_emp_tbl.extend;
      5  end;
      6  /
    declare
    ERROR at line 1:
    ORA-06531: Reference to uninitialized collection
    ORA-06512: at line 4
    SQL>
    2. Empty:
    SQL> set serveroutput on
    SQL> declare
      2      v_emp_tbl emp_tbl_type := emp_tbl_type();
      3  begin
      4      dbms_output.put_line('Nested table v_emp_tbl has ' || v_emp_tbl.count || ' element(s).');
      5  end;
      6  /
    Nested table v_emp_tbl has 0 element(s).
    PL/SQL procedure successfully completed.
    SQL>
    NULL applies to a nested table element, not to the nested table itself:
    SQL> declare
      2      v_emp_tbl emp_tbl_type := emp_tbl_type();
      3  begin
      4      v_emp_tbl.extend;
      5      if v_emp_tbl(1) is null
      6        then
      7          dbms_output.put_line('Nested table v_emp_tbl first element is NULL.');
      8      end if;
      9  end;
    10  /
    Nested table v_emp_tbl first element is NULL.
    PL/SQL procedure successfully completed.
    SQL>
    SY.

  • Question regarding Export and Import

    First let me say that any software that comes without a save button should be sold with a warning label.
    Question 1: I am having an issue comprehending how to save a photo. In my case I select the photo, zoom in on the subject, and export it to my desktop. The picture on my desktop does not incorporate the change. Am I missing a step? What do I need to do to export it with this change? I actually watched a YouTube video on this and could not see what I was not doing.
    Question 2: I just installed Lightroom and am trying to import my 12k-strong photo collection. The Import button pulls in about 2k and then cannot find any more. The photos are stored in folders by date within a master folder. I am selecting the master folder. I can go in and import the sub-folders individually, but I do not want to do that 200 times. There is no apparent way to go into the subfolder level and select more than one folder.
    Can anyone help me upgrade my opinion of Lightroom from its current level of POS to usable?

    Ihatelightroom wrote:
    First let me say that any software that comes without a save button should be sold with a warning label.
    Why?
    Question 1: I am having an issue comprehending how to save a photo. In my case I select the photo, zoom in on the subject, and export it to my desktop. The picture on my desktop does not incorporate the change. Am I missing a step? What do I need to do to export it with this change? I actually watched a YouTube video on this and could not see what I was not doing.
    You must have selected the wrong option in the Export dialog box. Under "File Settings", you need to select JPG and not "Original". Of course, you probably need to do some additional viewing of videos (or some reading) to learn that most people's workflow does not automatically include a "Save" or "Export" after editing the photo. It's not a necessary part of Lightroom's workflow, unless you need the photo for some non-Lightroom activity.
    Question 2: I just installed Lightroom and am trying to import my 12k-strong photo collection. The Import button pulls in about 2k and then cannot find any more. The photos are stored in folders by date within a master folder. I am selecting the master folder. I can go in and import the sub-folders individually, but I do not want to do that 200 times. There is no apparent way to go into the subfolder level and select more than one folder.
    In the Import dialog box, on the left, under "Source", there is a checkbox that says "Include SubFolders". Make sure this is checked.
    Seriously, you need to spend some time reading introductory material about LR because Lightroom does not work like any other photographic software you might have used in the past. You are handling it as if it was no different than standard photo editing software, and you are going to be frustrated if that is your mindset. See the videos at adobe.tv and read this: http://www.flickr.com/groups/adobe_lightroom/discuss/72157603590978170/
