Dynamic Converter - conversion issues

Hi Experts,
I am using Dynamic Converter to convert Word documents into HTML in Site Studio 10gR4.
In the GUI template I have set the global option "Use this target for external links" to "_blank".
It is working fine for external URLs: all external URLs open in a new window. But internal links are also opening in a new window, and I want internal links to open in the same window.
For example, my website domain is www.example.com, and in the Word document I have links like http://www.example.com/abc.html. I want all such internal links to open in the same window.
Please help. We have a release this week :(

Hi
When you set the DC template and layout and apply them to sections of the site, the target setting is effectively global for that section of the page. So you may need to change the template code, or add a condition that checks whether a link is internal or external and sets the window target accordingly (a rough sketch follows below).
Thanks
Srinath
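
For what it's worth, a minimal client-side sketch of that check (TypeScript compiled to JavaScript and included in the Site Studio layout or GUI template; the domain value and the DOM-based approach are assumptions for illustration, not something the converter provides out of the box):

    // Hypothetical sketch: keep the converter's "_blank" default, but reset the
    // target on links whose host matches the site's own domain (or is relative).
    const internalHost = "www.example.com"; // domain taken from the example above

    document.querySelectorAll<HTMLAnchorElement>("a[href]").forEach((link) => {
      // link.hostname is empty for relative URLs, which are internal by definition
      const internal = link.hostname === "" || link.hostname === internalHost;
      if (internal) {
        link.target = "_self"; // open internal links in the same window
      }
    });

This would run after the converted content is in the page; the same condition could equally be applied server-side when post-processing the converted HTML.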

Similar Messages

  • Conversion issue in encoding in PI

    Hi,
    I am having an encoding conversion problem. Let's look at the scenario in detail first.
    Scenario: SNC -> PI (through proxy) -> MQ (through receiver JMS adapter) -> SeeBeyond (a middleware system).
    Data comes from SNC into PI in UTF-8 encoding, whereas all the other systems in the above flow use ISO 8859-1. Hence UTF-8 needs to be converted to ISO 8859-1, and PI does this conversion in the receiver JMS adapter. CCSID has been set to 00819, which is ISO 8859-1. But the XML declaration at the beginning of the document still says UTF-8, like below:
      <?xml version="1.0" encoding="utf-8" ?>
    whereas the content of the XML has already been converted to ISO 8859-1 by the receiver JMS adapter. Now the problem:
    Problem: After the data is passed to SeeBeyond, the mapping fails there, because SeeBeyond reads the XML declaration at the beginning, finds UTF-8 declared, and tries to interpret the message as UTF-8 while the content is actually ISO 8859-1, so the mapping fails. Please note that all the systems except SeeBeyond just pass the data through; no mapping is done anywhere except in SeeBeyond.
    Workaround: Is there any way we can change the encoding in the XML declaration from UTF-8 to ISO 8859-1 in the receiver JMS adapter itself in PI, like this:
    <?xml version="1.0" encoding="iso-8859-1" ?>
    I think this might resolve our problem. Any suggestions would be appreciated.
    Thanks and Regards
    Soumya

    Hi Stefan,
    Thanks for your reply. I have gone through the link you specified. Yes, that is another way to resolve the issue. But here we have asked the receiver system (SeeBeyond) to change the encoding handling in their mapping, because changing that in SeeBeyond is less time-consuming than adding a new bean to the JMS adapter. Our problem was resolved by changing the mapping on the receiver end. Thanks anyway for your valuable reply.
    Thanks
    Soumya
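
    For reference, the prolog rewrite asked about above amounts to fixing the encoding pseudo-attribute after the payload bytes have been converted. A rough, language-neutral sketch of the idea (TypeScript purely for illustration; in PI this would normally be done in a custom adapter module or Java mapping, which is the bean-based option discussed above):

        // Illustrative only: after the payload has been converted to ISO 8859-1,
        // rewrite the XML declaration so it matches the actual encoding.
        function fixXmlDeclaration(xml: string): string {
          return xml.replace(/encoding="utf-8"/i, 'encoding="iso-8859-1"');
        }

        // fixXmlDeclaration('<?xml version="1.0" encoding="utf-8" ?><root/>')
        //   -> '<?xml version="1.0" encoding="iso-8859-1" ?><root/>'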

  • Umlaut Conversion issue in Sender communication channel SAP PI

    Hi Gurus,
    We are facing an issue with conversion.
    There is an umlaut conversion issue in the sender communication channel, which is why the channel is not able to pick up the file from the path.
    Sender CC error:
    Value of incoming field is too large. Segment:'IMD', Field:'7008', MaxLength:'35', value:'Plssvvkbecher Lübzer 0,4 (1280Stk p' DESCRIPTION: The length of the field value is too big !
    Actually the field contains only 35 characters; the value is: 'Plssvvkbecher Lübzer 0,4 (1280Stk p'
    We tried with "ISO-8859-1"; if the field value is less than 35 characters, it converts fine.
    Please help me out with this issue.
    Thank you.
    Regards,
    Jittu.

    Hi Jittu,
    Have you tried using the codepage conversion bean in the module configuration of the sender channel? It would be like:
    AF_Modules/TextCodepageConversionBean with a parameter of Conversion.charset and a value of utf-8.
    Regards,
    Ryan Crosby
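
    For anyone finding this later, the module entry described above would sit in the sender channel's Module tab roughly as follows (only the module name and parameter come from the reply; the key "codepage" and the exact labels are illustrative):

        Processing Sequence (before the adapter's standard module):
          Module Name: AF_Modules/TextCodepageConversionBean   Type: Local Enterprise Bean   Module Key: codepage
        Module Configuration:
          Module Key: codepage   Parameter Name: Conversion.charset   Parameter Value: utf-8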

  • Language conversion issue in PI7.1

    Hi,
    My scenario is file to IDoc. I am facing language conversion issues.
    For example, one of the fields in the source file has the value "Différence sur net", but in the IDoc field the value appears as "DiffÃ©rence sur net".
    With the same data I have done a file-to-file scenario, with and without content conversion, and the output comes out correctly as "Différence sur net".
    I have also enabled the Unicode checkbox in the RFC destination on the XI server, but the value still comes through as "DiffÃ©rence sur net". Please advise how I can get the target field value to match the source field value.
    Can anyone help me to sort out this issue?
    Thanks.
    Dinesh

    Hi,
    In the sender channel, try using File Type: Text and File Encoding: UTF-8.
    I think this will solve your issue.
    Thank you,
    Siva
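
    The symptom (every é turning into Ã©) is the classic signature of UTF-8 bytes being decoded as ISO 8859-1, which is why forcing the channel to read the file as UTF-8 helps. A small illustrative sketch of the mismatch (TypeScript, purely to demonstrate the effect):

        // Encode the correct string as UTF-8, then (wrongly) decode it as ISO 8859-1.
        const bytes = new TextEncoder().encode("Différence sur net"); // UTF-8 bytes
        const garbled = new TextDecoder("iso-8859-1").decode(bytes);
        console.log(garbled); // "DiffÃ©rence sur net" - the value seen in the IDoc field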

  • Currency conversion issue in SPM. We are getting incorrect results with the SPM conversion function from one of the document currencies to USD.

    Currently we are using SPM 2.0 and we have been facing currency conversion issues.
    Please help me with the following aspects.
    1) Where does currency conversion actually happen in SPM? Is it the global program that does the conversion, or something else?
    2) We have a conversion issue for one currency, where the conversion function gives incorrect results when converting from that document currency to USD. The document currency is being treated as having a 1:1 ratio with the dollar, which is incorrect.
    3) We have verified the currency tables on both the BI side and the ECC side.
    Please help me understand this issue, and let me know if you need more information.
    It is a production issue, so your immediate input would be appreciated.
    Thanks
    Kiran

    Hi Arun,
    The following information may be helpful to you. The SSA_HELPER_PROGRAM has options regarding currency settings.
    EXCH_RATE_TYPE: This flag governs the exchange rate type which will be used for currency conversion in data management. For example, if RSXAADMIN contains an entry EXCH_RATE_TYPE = 'ZSPM', then the conversion type used for currency conversion is ZSPM. The default value for the exchange rate type is 'M'. More details can be found in note 1278988.
    CURRENCYCONVERSION: By default, data management converts all the measures in transaction currency to reporting currency and copies them over to the corresponding measure in reporting currency. If the measure in reporting currency is already available in the source, it might be desirable to disable the currency conversion. To disable the conversion you can make an entry CURRENCYCONVERSION = ' ' in the table RSXAADMIN. This can also be achieved by running the program SSA_HELPER_PROGRAM with the option DEACTIVATE_CURRENCYCONVERSION. The conversion can be reactivated by running the same program with the option ACTIVATE_CURRENCYCONVERSION.
    UNITCONVERSION: Similar to the above. To deactivate unit conversion you can run the program with the option DEACTIVATE_UNITCONVERSION, and to reactivate it, ACTIVATE_UNITCONVERSION. By default both conversions are switched on.
    EXTERNAL_CURRENCIES: Normally most international currencies are stored with two decimal places; however, certain currencies have 0 or 1 decimal place. For example, JPY has 0 decimal places. SAP's internal format stores even these currencies with 2 decimal places and, at display time, changes the value to the right number of decimal places. If a file from an external source is loaded into SPM, it might have the format with 0 decimal places in the file. To convert it to the SAP standard format, post-processing needs to be done on this value. If that is the case, you can set the flag EXTERNAL_CURRENCIES = 'X' in the table, which will enable the post-processing for these values. This flag can also be set and reset using the helper program with the options TURNON_EXT_CURRENCY_FORMAT and TURNOFF_EXT_CURRENCY_FORMAT.
    Kind Regards,
    John Harris
    Senior Support Engineer, SAP Active Global Support

  • UCM Dynamic Converter Issue

    Hi All,
    When adding a default GUI and/or Layout template to the Dynamic Converter configuration settings, the template takes effect but works in an ad-hoc manner and does not convert all the documents according to the configuration. (The templates were created as specified by the accompanying Oracle ECM guides.)
    In addition, when attempting to implement and use Dynamic Converter Template Selection Rules, these seem to override the ad-hoc behaviour mentioned above but do not apply any changes as per the specified template (the HTML rendition is produced without the created template being applied).
    Any ideas on how to overcome these issues? Anyone?
    I have uninstalled, re-installed, cleared and reloaded the templates, and still no luck.
    We are running UCM 10g with the latest patches for the core system and for the DC component.
    Thanks

    Thanks for reply.
    We do have the latest version from Metalink (8.1.0.715).
    I understand about the caching of results. But when we create new rules and check in new content, it doesn't apply any rules/templates and just does a standard conversion.
    The templates work if there are no rules in the system and we apply them as the default in the configuration settings, but we need different templates for different business units.
    On a side note, it does apply the templates as required from Site Studio/WCM, just not from the UCM rules for the Dynamic Converter.
    Thanks

  • Currency Conversion Issue

    Hi Experts,
    I have an issue with my currency report,
    where the user wants to see the sales volume data converted from CAD to USD.
    When I checked the query, the conversion type is average rate (M) from source currency to CAD, and the target currency is CAD only.
    But when he executes the report he wants to see the volume data in USD.
    When I execute the report I see the error below:
    Could not find any data to display. This might be due to the current selection of variable or filter values.
    Can someone please help me out with this issue?
    thanks
    Raj

    Hi Venu,
    Thanks for the post.
    There are no selection criteria being used by the user; he has a web link, and once he clicks it he goes straight into the report.
    So there is no selection screen available,
    and there is no variable created for the currency conversion type.
    My requirement here is to display the currency in USD instead of CAD.
    The target currency is maintained as CAD in transaction RSCUR.
    How would I do this?
    The same report is also used by CAD users, so the view shouldn't change for either set of users.
    What I am thinking is:
    is there any possibility to create a variable on currency so that the required currency can be selected dynamically by the user on a selection screen?
    Please let us know possible solutions that do not change the view for the CAD users.
    Please let me know if you need more info.
    Regards
    Raj

  • XML Publisher - Dynamic Data Columns Issue

    Hi,
    I am creating an amortization report where I need to show the amortization schedule for unearned revenue. Customers have signed contracts of different lengths, and I need to show the amounts for their remaining contract months. So if there are 2 months remaining in the contract I should show data for 2 months only, and if there are 10 months I should display 10 months. If we run the report for both customers it should display two lines: for the first customer the data should cover only 2 months, and for the second row (customer) there should be data for the next 10 months. These months should be columns in the report (like Excel columns, not rows); I need to display all months in columns instead of rows.
    I have achieved that using Dynamic Data Columns as described in the user guide. Everything is working fine except for the following issues:
    1. How do I create a page-level total?
    I have created a page total in my template using <?add-page-total:TOTUREV;'UREV'?> and display it using <?show-page-total:TOTUREV?>, but when I run the report in Excel format these page totals do not display. These columns are static columns, not dynamic.
    If I run the report in PDF format then the static column total is displayed correctly.
    2. When I run the report in Excel format it runs fine and shows all the columns properly, but if I run it in PDF format the dynamic columns are not displayed in their own columns; they all overwrite each other in a single column instead of expanding.
    3. How do I get the total for the dynamic columns?
    I need to display the page-level total for the dynamic columns as well; how do I do that?
    4. When I run the report in Excel format the 2 decimal places of the numbers are gone (it works fine in PDF output), e.g. 12.50 becomes 12.5 and 14.00 becomes 14. I need to maintain those 2 decimal places. I have tried using <fo:bidi-override direction="ltr" unicode-bidi="bidi-override"><?format-number:CVALUE;'999999D99'?></fo:bidi-override> but this does not solve my problem completely. It shows the values correctly in Excel, but then I am not able to do any calculation on those columns; it looks like it converts them to text values.
    Any help is really appreciated. Please let me know if you need the XML template and data file.
    Regards
    Hitesh

    Hi Hitesh,
    Can you please upload your RTF template and XML file? I will try to spend some time on this issue.
    Cheers
    Sachin

  • Conversion Issues

    I am upgrading software. I used to use MS Office Pro 2003, and now 2007. In converting documents from 2007 into Acrobat Pro 8.x, I was having a problem with the page numbers not getting converted properly. For example, I use "Page x of y" at the bottom of documents and the conversion was giving "Page x of x".
    So I figured it could be time to upgrade to Acrobat 9.0 Pro; I am currently using the trial version. Well, I no longer have the problem above, but I have another, bigger problem. The conversion is changing the fonts of most text that is centered. I could live with that some of the time; however, certain documents must retain the specific font I use. This is pretty much a deal breaker if I cannot fix it.
    Any thoughts?
    Many thanks and Happy Halloween.

    OK, I figured out the cause of the issue. If a font is regular and not bolded in the font box, it converts correctly when bolded with the "B" button. However, if it is bolded through the bold box, it converts to another font. So, is it Word or Adobe that is the culprit? And what is the solution?

  • Currency conversion issue while creating PO from Shopping cart

    Hello Experts,
    I'm facing an issue in SRM during limit purchase order creation from a limit shopping cart.
    Scenario:
    The shopping cart was created on 1st June and approved on 20th June, and the PO was created the same day. But the currency conversion does not match at all, and I have not been able to trace why. I have tried several times to replicate the issue, but it never happens in the test environments.
    The PO is created with reference to the vendor currency. For example, if the SC was created in USD and the vendor order currency is EUR, then the PO is created in EUR.
    The relevant notes have already been applied in the system, and it works fine when the SC is created and approved on the same day.
    Please give me some idea of how to fix this issue.
    Is there any way to make the SC currency take priority over the vendor order currency when creating the PO?
    Regards
    Pratap J

    Hello,
    Read information available in KBA 1862453.
    It mentions this issue.
    Regards,
    Ricardo

  • Crystal Data Conversion Issue (Error converting data type varchar to datetime)

    Hi,
    I can run the stored procedure without error in SQL Server using my personal credentials as well as the database credentials.
    I can also run the Crystal Report against the stored procedure without error on my desktop, using my personal credentials as well as the database credentials.
    But when I upload the Crystal Report to BOBJDEV and run it using the database credentials, the report fails with "Error in File ~tmp1d1480b8e70fd90.rpt: Unable to connect: incorrect log on parameters. Details: [Database Vendor Code: 18456 ]", although I can run the report successfully on BOBJDEV using my personal credentials.
    I googled this issue (the data conversion error message) and a lot of people suggested doing "Verify Database" in Crystal Reports. So I did that, but when I do it I get an error message like this:
    Error converting data type varchar to datetime.
    Where do you think the error might be occurring? Has anyone faced this kind of issue before? If so, how do I resolve it?
    (FYI, I am using Crystal Reports 2008, and for the stored procedure I used SSMS 2012.)
    Please help me with this issue.
    Thanks & Regards.
    Naveen.

    Hello Naveen,
    Since the report works fine in the CR designer on the desktop, we need to figure out where you should post this question.
    By BOBJDEV do you mean BusinessObjects Enterprise or Crystal Reports Server? If so, please post this question in the BI Platform space.
    -Jamie

  • Dynamic Local User Issue

    When I look at the snap-ins through ConsoleOne I can see that the ZENworks 7.0.1 snap-in is installed.
    I have Novell Client 4.91 SP5 and ZENworks Client 7.0.173.91015 installed on the clients running WinXP Pro SP3.
    There are a couple of different failures that happen.
    Scenario 1:
    I install a Latitude D610 from scratch with an original WinXP Pro SP3 CD. I only install the drivers for the LAN card to get access to the network; I do not apply Windows updates etc.
    I install Novell Client 4.91 SP5, and after that I install ZENworks Client 7.0.173.91015, then apply some registry settings to make the Novell client use the "tab" function, hide advanced settings, etc.
    I have my eDirectory user "ADMIN1" with a policy package whose Dynamic Local User settings create a local user named Admin, but I am not using a volatile user, so the local Windows user Admin is kept after logout.
    I log in once with my ADMIN1 user and it creates the local profile Admin from Default User (with the help of ZENworks and the Dynamic Local User policy?). I restart the computer and log in again, and the local profile Admin crashes and a new one is created from Default User, but this local user profile is named Admin.Computername.
    I've tested this with at least four other computers (different hardware), so it can't be a driver issue.
    I've looked through the local logs and I can't find anything about a problem reading NTUSER.DAT that could explain the failure to load the local profile.
    I even tested this scenario after applying all Windows updates etc., with two different versions of the ZENworks client and so on. I've been through this at least 100 times now, and the same failure keeps happening. I've even tested this in a virtual environment (VMware Workstation).
    Scenario 2:
    Like the problem described above, but in some cases the connection between the ZENworks server side and the ZENworks client on the client computer is lost (or something similar), and the client does not attempt to use the Dynamic Local User settings: I get the Windows login window and have to log in to an already existing local Windows account. In other words, I can't log in to the Admin profile, since I don't know the credentials for that account: it was created by the ZENworks Dynamic Local User settings, and in those settings you can't set a password, only the name and role of the Windows account that should be created.
    After a while I try again, and then the Dynamic Local User settings are applied and I am logged into the (let's say) ZENworks-created local user profile (as set by the Dynamic Local User settings).
    I want to mention that on all the older computers that have not been reinstalled, I can log in without problems and without any crashes of the local Windows profile.
    I succeeded once without any local profile crash, rebooted that computer over and over again, and saw no failure. If it succeeds twice, it seems to be fine. But then I reinstalled that computer exactly the same way, and this time it failed on the second try and I got a crashed profile.
    The server where I have ZENworks runs on fairly old hardware; could that be the cause? Could it be some timeouts?
    The consultants I use to fix some problems in our environment updated ZENworks on the server side just before Christmas. Could it be a problem with some Windows patch etc.?
    Any help would be appreciated!
    // Jokohanho

    > installed on the clients running WinXP Pro SP3.
    <snip>
    > I restart the computer and log in again, and the local profile Admin
    > crashes and a new one is created from Default User, but this local
    > user profile is named Admin.Computername.
    I only know of one XP SP3 issue that could cause this, but it involves a
    password change and roaming profiles:
    "When you try to log on to a Windows XP SP3-based computer by using a
    roaming profile, the roaming profile cannot load."
    http://support.microsoft.com/kb/958058
    Regards
    Rolf Lidvall
    Swedish Radio (Ltd)

  • PSE 8 upgrade album conversion issue

    I upgraded my system to Windows 7 and since I was going to have to reload everything anyway, I upgraded to PSE 8 at the same time.
    Before the upgrade I saved my catalog files to my external drive and then moved them back to the C:\Program Files\Adobe folder. I put them there because I didn't see a catalog folder under the Elements Organizer folder.
    Anyway, I then converted my catalog and renamed it "My Catalog 7.0" to differentiate it.
    The conversion says it was good, and upon inspection all meta tags and albums seem okay. The issue is that it didn't bring in the images from May 2009 onward. I know those were in the 7.0 catalog and had tags on them.
    Any suggestions on how to get those images (with their metadata) into PSE 8.0?
    Susan

    Thanks, John.
    You were right: I pulled the wrong file to convert (the one I used to convert 6 to 7).
    Thankfully I had a backup and was able to restore my correct version 7 catalog. All is good now.

  • Non English characters conversion issue in LSMW BAPI Inbound IDOCs

    Hi Experts,
    We have some fields in the customer master LSMW data load program that can contain non-English characters, and we are facing issues with non-English character conversion in the LSMW BAPI method. The LSMW read and conversion steps show the non-English characters properly without any issue, but while creating the inbound IDocs most of the non-English characters are replaced with '#', which causes problems when creating the customer master data in the system. In our scenario the customer data has non-English characters in the first name, last name and address details. Does any specific setting need to be made on our side? Please suggest how we can resolve this issue.
    Thanks
    Rajesh Yadla

    If your language is a Unicode language then you need to change the options: in SAP GUI, change the encoding to Unicode from the initial screen via Customize Local Layout (Alt+F12) -> Options -> I18N -> Encoding ...

  • Unicode MDMP conversion issue with document management table.

    Hi,
    We are in the process of doing a Unicode conversion for our ECC 6.0 MDMP non-Unicode system. We have completed the scans and found close to 14 million words not assigned to code pages.
    Then we checked at the table level which table has the highest number of such words. There is one custom table, ZQMDOCS, which is used to store MS Word documents that are test procedures for our labs.
    If we view an MS Word document on the non-Unicode system we are able to get to it; we just did some dummy assignments and completed the import, and on our Unicode system, if we try to open the document, it opens in a readable format.
    So the issue is that the data stored in the document is in English, but the formatting is read as special characters in a Unicode system, since SAP stores the raw data.
    Please suggest ways to resolve this issue or any possible workarounds. This is a very critical table (43,000 documents and close to 14 million words not assigned to a code page).
    Thanks
    Junaid.

    Hi Venkat,
    Thanks a lot for your immediate response.
    The InfoObject 0DOC_TYPE has no conversion exit by default, but when data comes from R/3 it is converted before being sent to BW. That's why I am planning to use the conversion exit "AUART" in the InfoObject.
    I checked the data in R/3 using RSA3 and it shows the sales document type as "OR", but for the same transaction data, when I check the PSA, it shows "TA".
    Could you please let me know if there are any other options?
    Thanks in advance,
    Dara.
