Some thoughts on issues in portal building

I have been studying a few portal platforms to use in some prospective projects.
Quite often there is the issue of integrating a portal with a content publishing system that we have developed in-house.
By "publishing system" I mean a system that provides an HTML-template-based function where users can create, edit and preview dynamic texts (usually stored in a database) and then publish them to an external site. I understand that the portal has some kind of repository where users can upload files, but are there any template, preview or publishing functions in the portal?
Also, can I easily change the presentation layer of the portal? I don't mean by using the admin functions "edit stylesheet" or "edit layout"; suppose a customer gives me a complete site in HTML layout and I want to use Oracle Portal for the underlying server logic but with the new site layout. Is this possible? How tied am I to the present portal presentation?
Another thing that has surprised me is that there do not appear to be any Java APIs to navigate and retrieve information from the repository (i.e. to read content areas and documents). I had expected to find a full Java API, similar to your IFS product API, that would allow me to perform operations on the repository. Is this planned in future API releases?
I would really appreciate it if anyone out there with any ideas or similar thoughts could drop me a line here.
Greetings. Johan Ekman.

Similar Messages

  • Some Thoughts On An OWB Performance/Testing Framework

    Hi all,
    I've been giving some thought recently to how we could build a performance tuning and testing framework around Oracle Warehouse Builder. Specifically, I'm looking at ways in which we can use some of the performance tuning techniques described in Cary Millsap/Jeff Holt's book "Optimizing Oracle Performance" to profile and performance tune mappings and process flows, and to use some of the ideas put forward in Kent Graziano's Agile Methods in Data Warehousing paper http://www.rmoug.org/td2005pres/graziano.zip and Steven Feuerstein's utPLSQL project http://utplsql.sourceforge.net/ to provide an agile/test-driven way of developing mappings, process flows and modules. The aim of this is to ensure that the mappings we put together are as efficient as possible, work individually and together as expected, and are quick to develop and test.
    At the moment, most people's experience of performance tuning OWB mappings is firstly to see if it runs set-based rather than row-based, then perhaps to extract the main SQL statement and run an explain plan on it, then check to make sure indexes etc. are being used correctly. This involves a lot of manual work, doesn't factor in the data available from the wait interface, doesn't store the execution plans anywhere, and doesn't really scale out to encompass entire batches of mappings (process flows).
    For some background reading on Cary Millsap/Jeff Holt's approach to profiling and performance tuning, take a look at http://www.rittman.net/archives/000961.html and http://www.rittman.net/work_stuff/extended_sql_trace_and_tkprof.htm. Basically, this approach traces the SQL that is generated by a batch file (read: mapping) and generates a file that can be later used to replay the SQL commands used, the explain plans that relate to the SQL, details on what wait events occurred during execution, and provides at the end a profile listing that tells you where the majority of your time went during the batch. It's currently the "preferred" way of tuning applications as it focuses all the tuning effort on precisely the issues that are slowing your mappings down, rather than database-wide issues that might not be relevant to your mapping.
    For some background information on agile methods, take a look at Kent Graziano's paper, this one on test-driven development http://c2.com/cgi/wiki?TestDrivenDevelopment , this one http://martinfowler.com/articles/evodb.html on agile database development, and the sourceforge project for utPLSQL http://utplsql.sourceforge.net/. What this is all about is having a development methodology that builds in quality but is flexible and responsive to changes in customer requirements. The benefit of using utPLSQL (or any unit testing framework) is that you can automatically check your altered mappings to see that they still return logically correct data, meaning that you can make changes to your data model and mappings whilst still being sure that everything will compile and run.
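    To make the unit-testing idea concrete, here is a minimal sketch of what a utPLSQL-style test for a mapping's target table might look like. The package name, the DIM_CUSTOMER target table and the expected row count of 42 are all made-up examples, and the calls assume the utPLSQL v1 conventions (ut_setup/ut_teardown plus the utAssert assertion package); treat it as an illustration of the approach rather than a ready-made test.

    CREATE OR REPLACE PACKAGE ut_dim_customer AS
      PROCEDURE ut_setup;
      PROCEDURE ut_teardown;
      PROCEDURE ut_check_row_count;
    END ut_dim_customer;
    /

    CREATE OR REPLACE PACKAGE BODY ut_dim_customer AS
      PROCEDURE ut_setup IS
      BEGIN
        NULL;  -- e.g. load a known test dataset into the staging table
      END;

      PROCEDURE ut_teardown IS
      BEGIN
        NULL;  -- clean up any test data created in ut_setup
      END;

      -- After the mapping has run, check the target holds the expected number of rows
      PROCEDURE ut_check_row_count IS
        l_count PLS_INTEGER;
      BEGIN
        SELECT COUNT(*) INTO l_count FROM dim_customer;
        utAssert.eq('DIM_CUSTOMER row count after load', l_count, 42);
      END;
    END ut_dim_customer;
    /

    A test run would then be kicked off with something like utPLSQL's test procedure (e.g. EXEC utplsql.test('dim_customer')), and the pass/fail results recorded against the mapping in the project workbook or repository.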
    Observations On The Current State of OWB Performance Tuning & Testing
    At present, when you build OWB mappings, there is no way (within the OWB GUI) to determine how "efficient" the mapping is. Often, when building the mapping against development data, the mapping executes quickly and yet when run against the full dataset, problems then occur. The mapping is built "in isolation" from its effect on the database and there is no handy tool for determining how efficient the SQL is.
    OWB doesn't come with any methodology or testing framework, and so apart from checking that the mapping has run, and that the number of rows inserted/updated/deleted looks correct, there is nothing really to tell you whether there are any "logical" errors. Also, there is no OWB methodology for integration testing, unit testing, or any other sort of testing, and we need to put one in place. Note - OWB does come with auditing, error reporting and so on, but there's no framework for guiding the user through a regime of unit testing, integration testing, system testing and so on, which I would imagine more complete developer GUIs come with. Certainly there's no built-in ability to use testing frameworks such as utPLSQL, or a part of the application that lets you record whether a mapping has been tested, and changes the test status of mappings when you make changes to ones that they are dependent on.
    OWB is effectively a code generator, and this code runs against the Oracle database just like any other SQL or PL/SQL code. There is a whole world of information and techniques out there for tuning SQL and PL/SQL, and one particular methodology that we quite like is the Cary Millsap/Jeff Holt "Extended SQL Trace" approach that uses Oracle diagnostic events to find out exactly what went on during the running of a batch of SQL commands. We've been pretty successful using this approach to tune customer applications and batch jobs, and we'd like to use this, together with the "Method R" performance profiling methodology detailed in the book "Optimizing Oracle Performance", as a way of tuning our generated mapping code.
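    For reference, the session-level commands that the Extended SQL Trace approach relies on look something like the sketch below: switch event 10046 tracing on (level 12 includes bind values and wait events) around the code to be profiled, then switch it off so the trace file in user_dump_dest can be run through tkprof or a profiler. The tracefile identifier is just an example value.

    -- Tag the trace file so it is easy to find in user_dump_dest
    ALTER SESSION SET tracefile_identifier = 'owb_map_customer';

    -- Level 12 = gather bind variable values and wait events as well as the SQL
    ALTER SESSION SET EVENTS '10046 trace name context forever, level 12';

    -- ... run the mapping / batch of SQL to be profiled here ...

    ALTER SESSION SET EVENTS '10046 trace name context off';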
    Whilst we want to build performance and quality into our code, we also don't want to overburden developers with an unwieldy development approach, because we know what will happen: after a short amount of time, it won't get used. Given that we want this framework to be used for all mappings, it's got to be easy to use, cause minimal overhead, and have results that are easy to interpret. If at all possible, we'd like to use some of the ideas from agile methodologies such as eXtreme Programming, SCRUM and so on to build in quality but minimise paperwork.
    We also recognise that there are quite a few settings that can be changed at a session and instance level, that can have an effect on the performance of a mapping. Some of these include initialisation parameters that can change the amount of memory assigned to the instance and the amount of memory subsequently assigned to caches, sort areas and the like, preferences that can be set so that indexes are preferred over table scans, and other such "tweaks" to the Oracle instance we're working with. For reference, the version of Oracle we're going to use to both run our code and store our data is Oracle 10g 10.1.0.3 Enterprise Edition, running on Sun Solaris 64-bit.
    Some initial thoughts on how this could be accomplished
    - Put in place some method for automatically / easily generating explain plans for OWB mappings (issue - this is only relevant for mappings that are set-based, and what about pre- and post-mapping triggers?)
    - Put in place a method for starting and stopping an event 10046 extended SQL trace for a mapping
    - Put in place a way of detecting whether the explain plan / cost / timing for a mapping changes significantly
    - Put in place a way of tracing a collection of mappings, i.e. a process flow
    - The way of enabling tracing should either be built in by default, or easily added by the OWB developer. Ideally it should be simple to switch it on or off (perhaps levels of event 10046 tracing?)
    - Perhaps store trace results in a repository? reporting? exception reporting?
    - At an instance level, come up with some stock recommendations for instance settings
    - Identify the set of instance and session settings that are relevant for ETL jobs, and determine what effect changing them has on the ETL job
    - Put in place a regime that records key instance indicators (STATSPACK / ASH) and allows reports to be run / exceptions to be reported (see the snapshot sketch after this list)
    - Incorporate any existing "performance best practices" for OWB development
    - Define a lightweight regime for unit testing (as per agile methodologies) and a way of automating it (utPLSQL?) and of recording the results so we can check the status of dependent mappings easily
    - Other ideas around testing?
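    One lightweight way of covering the STATSPACK point above might simply be to bracket each ETL run with a pair of snapshots, so the standard STATSPACK report can later be run between the two snap IDs. This assumes STATSPACK is installed in the usual PERFSTAT schema; it is only a sketch of the idea.

    -- From SQL*Plus, connected as PERFSTAT: snapshot immediately before the ETL batch starts
    EXEC statspack.snap;

    -- ... process flow / mappings execute here ...

    -- Closing snapshot; spreport.sql can then be run between the two snap IDs
    EXEC statspack.snap;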
    Suggested Approach
    - For mapping tracing and generation of explain plans, a pre- and post-mapping trigger that turns extended SQL trace on and off, places the trace file in a predetermined spot, formats the trace file and dumps the output to repository tables.
    - For process flows, something that does the same at the start and end of the process. Issue - how might this conflict with mapping level tracing controls?
    - Within the mapping/process flow tracing repository, store the values of historic executions, and have an exception report that tells you when a mapping's execution time varies by more than a certain amount (see the sketch after this list)
    - Get the standard set of preferred initialisation parameters for a DW, use these as the start point for the stock recommendations. Identify which ones have an effect on an ETL job.
    - Identify the standard steps Oracle recommends for getting the best performance out of OWB (workstation RAM etc) - see OWB Performance Tips http://www.rittman.net/archives/001031.html and Optimizing Oracle Warehouse Builder Performance http://www.oracle.com/technology/products/warehouse/pdf/OWBPerformanceWP.pdf
    - Investigate what additional tuning options and advisers are available with 10g
    - Investigate the effect of system statistics & come up with recommendations.
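    As a rough illustration of the exception-reporting idea, the query below assumes a hypothetical repository table MAP_EXEC_HISTORY (mapping_name, start_time, elapsed_secs) populated by the pre- and post-mapping triggers, and flags any mapping whose most recent run took more than 1.5 times its historical average. The table, columns and threshold are all assumptions for the sake of the sketch, not part of OWB.

    -- Flag mappings whose latest run took more than 1.5x their historical average
    WITH history AS (
      SELECT mapping_name,
             start_time,
             elapsed_secs,
             AVG(elapsed_secs) OVER (PARTITION BY mapping_name) AS avg_secs,
             MAX(start_time)   OVER (PARTITION BY mapping_name) AS last_run
      FROM   map_exec_history
    )
    SELECT mapping_name,
           elapsed_secs       AS latest_secs,
           ROUND(avg_secs, 1) AS avg_secs
    FROM   history
    WHERE  start_time = last_run
    AND    elapsed_secs > avg_secs * 1.5
    ORDER  BY elapsed_secs DESC;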
    Further reading / resources:
    - "Diagnosing Performance Problems Using Extended Trace", Cary Millsap
    http://otn.oracle.com/oramag/oracle/04-jan/o14tech_perf.html
    - "Performance Tuning With STATSPACK" Connie Dialeris and Graham Wood
    http://www.oracle.com/oramag/oracle/00-sep/index.html?o50tun.html
    - "Performance Tuning with Statspack, Part II" Connie Dialeris and Graham Wood
    http://otn.oracle.com/deploy/performance/pdf/statspack_tuning_otn_new.pdf
    - "Analyzing a Statspack Report: A Guide to the Detail Pages" Connie Dialeris and Graham Wood
    http://www.oracle.com/oramag/oracle/00-nov/index.html?o60tun_ol.html
    - "Why Isn't Oracle Using My Index?!" Jonathan Lewis
    http://www.dbazine.com/jlewis12.shtml
    - "Performance Tuning Enhancements in Oracle Database 10g" Oracle-Base.com
    http://www.oracle-base.com/articles/10g/PerformanceTuningEnhancements10g.php
    - Introduction to Method R and Hotsos Profiler (Cary Millsap, free reg. required)
    http://www.hotsos.com/downloads/registered/00000029.pdf
    - Exploring the Oracle Database 10g Wait Interface (Robin Schumacher)
    http://otn.oracle.com/pub/articles/schumacher_10gwait.html
    - Article referencing an OWB forum posting
    http://www.rittman.net/archives/001031.html
    - How do I inspect error logs in Warehouse Builder? - OWB Exchange tip
    http://www.oracle.com/technology/products/warehouse/pdf/Cases/case10.pdf
    - What is the fastest way to load data from files? - OWB exchange tip
    http://www.oracle.com/technology/products/warehouse/pdf/Cases/case1.pdf
    - Optimizing Oracle Warehouse Builder Performance - Oracle White Paper
    http://www.oracle.com/technology/products/warehouse/pdf/OWBPerformanceWP.pdf
    - OWB Advanced ETL topics - including sections on operating modes, partition exchange loading
    http://www.oracle.com/technology/products/warehouse/selfserv_edu/advanced_ETL.html
    - Niall Litchfield's Simple Profiler (a creative commons-licensed trace file profiler, based on Oracle Trace Analyzer, that displays the response time profile through HTMLDB. Perhaps could be used as the basis for the repository/reporting part of the project)
    http://www.niall.litchfield.dial.pipex.com/SimpleProfiler/SimpleProfiler.html
    - Welcome to the utPLSQL Project - a PL/SQL unit testing framework by Steven Feuerstein. Could be useful for automating the process of unit testing mappings.
    http://utplsql.sourceforge.net/
    Relevant postings from the OTN OWB Forum
    - Bulk Insert - Configuration Settings in OWB
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=291269&tstart=30&trange=15
    - Default Performance Parameters
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=213265&message=588419&q=706572666f726d616e6365#588419
    - Performance Improvements
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=270350&message=820365&q=706572666f726d616e6365#820365
    - Map Operator performance
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=238184&message=681817&q=706572666f726d616e6365#681817
    - Performance of mapping with FILTER
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=273221&message=830732&q=706572666f726d616e6365#830732
    - Poor mapping performance
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=275059&message=838812&q=706572666f726d616e6365#838812
    - Optimizing Mapping Performance With OWB
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=269552&message=815295&q=706572666f726d616e6365#815295
    - Performance of mapping with FILTER
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=273221&message=830732&q=706572666f726d616e6365#830732
    - Performance of the OWB-Repository
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=66271&message=66271&q=706572666f726d616e6365#66271
    - One large JOIN or many small ones?
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=202784&message=553503&q=706572666f726d616e6365#553503
    - NATIVE PL SQL with OWB9i
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=270273&message=818390&q=706572666f726d616e6365#818390
    Next Steps
    Although this is something that I'll be progressing with anyway, I'd appreciate any comments from existing OWB users as to how they currently perform performance tuning and testing. Whilst these are perhaps two distinct subject areas, they can be thought of as the core of an "OWB Best Practices" framework, and I'd be prepared to write the results up as a freely downloadable whitepaper. With this in mind, does anyone have existing best practices for tuning or testing, have you tried using SQL trace and TKPROF to profile mappings and process flows, or have you used a unit testing framework such as utPLSQL to automatically test the set of mappings that make up your project?
    If you have any feedback, add it to this forum posting or send it directly to me at [email protected]. I'll report back on a proposed approach in due course.

    Hi Mark,
    Interesting post, but I think you may be focusing on the trees and losing sight of the forest.
    Coincidentally, I've been giving quite a lot of thought lately to some aspects of your post. They relate to some new stuff I'm doing. Maybe I'll be able to answer in more detail later, but I do have a few preliminary thoughts.
    1. 'How efficient is the generated code' is a perennial topic. There are still some people who believe that a code generator like OWB cannot be in the same league as hand-crafted SQL. I answered that question quite definitely: "We carefully timed execution of full-size runs of both the original code and the OWB versions. Take it from me, the code that OWB generates is every bit as fast as the very best hand-crafted and fully tuned code that an expert programmer can produce."
    The link is http://www.donnapkelly.pwp.blueyonder.co.uk/generated_code.htm
    That said, it still behooves the developer to have a solid understanding of what the generated code will actually do, such as how it will take advantage of indexes, and so on. If not, the developer can create such monstrosities as lookups into an un-indexed field (I've seen that).
    2. The real issue is not how fast any particular generated mapping runs, but whether or not the system as a whole is fit for purpose. Most often, that means: does it fit within its batch update window? My technique is to dump the process flow into Microsoft Project, and then to add the timings for each process. That creates a Critical Path, and then I can visually inspect it for any bottleneck processes. I usually find that there are no more than one or two dogs. I'll concentrate on those, fix them, and re-do the flow timings. I would add this: the dogs I have seen, I have invariably replaced. They were just garbage; they did not need tuning at all - just scrapping.
    Gee, but this whole thing is minimum effort and real fast! I generally figure that it takes maybe a day or two (max) to soup up system performance to the point where it whizzes.
    Fact is, I don't really care whether there are a lot of sub-optimal processes. All I really care about is the performance of the system as a whole. This technique seems to work for me. 'Course, it depends on architecting the thing properly in the first place. Otherwise, no amount of tuning is going to help worth a darn.
    Conversely (re. my note about replacing dogs) I do not think I have ever tuned a piece of OWB-generated code. Never found a need to. Not once. Not ever.
    That's not to say I do not recognise the value of playing with deployment configuration parameters. Obviously, I set auditing=none, and operating mode=set based, and sometimes I play with a couple of different target environments to fool around with partitioning, for example. Nonetheless, if it is not a switch or a knob inside OWB, I do not touch it. This is in line with my diktat that you shall use no other tool than OWB to develop data warehouses. (And that includes all documentation!) (OK, I'll accept MS Project.)
    Finally, you raise the concept of a 'testing framework'. This is a major part of what I am working on at the moment. This is a tough one. Clearly, the developer must unit test each mapping in a design-model-deploy-execute cycle, paying attention to both functionality and performance. When the developer is satisfied, that mapping will be marked as 'done' in the project workbook. Mappings will form part of a stream, executed as a process flow. Each process flow will usually terminate in a dimension, a fact, or an aggregate. Each process flow will be tested as an integrated whole. There will be test strategies devised, and test cases constructed. There will finally be system tests, to verify the validity of the system as a production-grade whole (stuff like recovery/restart, late-arriving data, and so on).
    For me, I use EDM (TM). That's the methodology I created (and trademarked) twenty years ago: Evolutionary Development Methodology (TM). This is a spiral methodology based around prototyping cycles within Stage cycles within Release cycles. For OWB, a Stage would consist (say) of a Dimensional update. What I am trying to do now is to graft this onto a traditional waterfall methodology, and I am having the same difficulties I had when I tried to do it back then.
    All suggestions on how to do that grafting gratefully received!
    To sum up, I'm kinda at a loss as to why you want to go deep into OWB-generated code performance stuff. Jeepers, architect the thing right, and the code runs fast enough for anyone. I've worked on ultra-large OWB systems, including validating the largest data warehouse in the UK. I've never found any value in 'tuning' the code. What I'd like you to comment on is this: what will it buy you?
    Cheers,
    Donna
    http://www.donnapkelly.pwp.blueyonder.co.uk

  • Strange issue in Report Builder

    Hi All,
    I am facing a strange issue in Report Builder.
    I have a report that is currently in production. When I open the report and just save it, without making any changes, the size of the report shrinks from 238 KB to 204 KB.
    When I execute this report in production it gives me the error "Invalid Body Size".
    Please help me, or else I'll have to ask God to save me.
    Thanks
    Aryan

    Hi Hussien,
    I tried that solution but it did not work. I put some debug statements in the Before Parameter Form, After Parameter Form and Before Report triggers. When I run the report I see the debug output of the Before and After Parameter Form triggers, but it fails somewhere in the Before Report trigger.
    I removed everything from the Before Report trigger and the result is still the same: "Invalid Body Size".
    Please, Hussien, can you give me a pointer as to which stage is processed between the After Parameter Form trigger and the Before Report trigger?
    Thanks
    Aryan

  • Issue with Template Builder

    Hello All,
    I am having an issue with Template Builder. After loading the XML data, when I try to insert a field I get the following error:
    Run-time error '6':
    Overflow
    My colleague, however, is not getting the error. The difference I observe is in the menu options:
    for me it shows as
    Template Builder --> Insert --> Table/Form
    whereas for him it shows as
    Template Builder --> Insert --> Table/Form --> Wizard (something like this)
    Is this a version difference? Please let me know.
    Kind Regards,
    Kumar.

    Can you get the latest version?
    http://www.oracle.com/technology/software/products/publishing/index.html

  • New User - What's the difference Portal Builder and PDK Developer Kit?

    I am a bit confused. I have been working with Portal Builder, creating some pages. I have recently run across the topic of the PDK Developer Kit. How is this different from Portal Builder, or is it? Is the PDK download for personal PCs or for the server? What does the PDK provide that Builder doesn't?
    Thanks.

    Hi,
    The Oracle Portal Developer Kit (PDK) is a very important and useful component of Oracle Portal. It helps developers (like me) to easily convert existing applications into powerful portlets. It also helps us to get some "context" information from the Portal and take decisions based on it.
    You can comfortably use the Portal Builder with the supplied portlets and develop business solutions - however, at some point you will notice that you are working within a confined box. If you need to think "outside the box", you will definitely need to have a look at the PDK.
    I have shown you the tip of the iceberg - I would encourage you to download and explore the powerful capabilities of the Oracle PDK.
    Regards,
    Sandeep

  • Issue With Report Builder After Installing SP3 for SQL 2008 R2

    Hello.  We are experiencing an issue with Report Builder 3.0 since installing SP3 for SQL 2008 R2 over the weekend.  You can no longer launch Report Builder from the Reporting Services URL (http://dicomweb/ReportServer/ReportBuilder/ReportBuilder_3_0_0_0.application). The ClickOnce deployment log is below:
          Server  : Microsoft-HTTPAPI/2.0
          X-AspNet-Version: 2.0.50727
     Application url   :
    http://dicomweb/ReportServer/ReportBuilder/RptBuilder_3/MSReportBuilder.exe.manifest
          Server  : Microsoft-HTTPAPI/2.0
          X-AspNet-Version: 2.0.50727
    IDENTITIES
     Deployment Identity  : ReportBuilder_3_0_0_0.application, Version=10.50.6000.34, Culture=neutral, PublicKeyToken=c3bce3770c238a49, processorArchitecture=x86
     Application Identity  : MSReportBuilder.exe, Version=10.50.6000.34, Culture=neutral, PublicKeyToken=c3bce3770c238a49, processorArchitecture=x86, type=win32
    APPLICATION SUMMARY
     * Online only application.
     * Trust url parameter is set.
    ERROR SUMMARY
     Below is a summary of the errors, details of these errors are listed later in the log.
     * Activation of
    http://dicomweb/ReportServer/ReportBuilder/ReportBuilder_3_0_0_0.application resulted in exception. Following failure messages were detected:
      + File, Microsoft.ReportingServices.ComponentLibrary.Controls.dll, has a different computed hash than specified in manifest.
    COMPONENT STORE TRANSACTION FAILURE SUMMARY
     No transaction error was detected.
    WARNINGS
     There were no warnings during this operation.
    OPERATION PROGRESS STATUS
     * [10/15/2014 9:35:56 AM] : Activation of
    http://dicomweb/ReportServer/ReportBuilder/ReportBuilder_3_0_0_0.application has started.
     * [10/15/2014 9:36:29 AM] : Processing of deployment manifest has successfully completed.
     * [10/15/2014 9:36:29 AM] : Installation of the application has started.
     * [10/15/2014 9:36:31 AM] : Processing of application manifest has successfully completed.
     * [10/15/2014 9:36:35 AM] : Found compatible runtime version 2.0.50727.
     * [10/15/2014 9:36:35 AM] : Detecting dependent assembly Sentinel.v3.5Client, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=msil using Sentinel.v3.5Client, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a,
    processorArchitecture=msil.
     * [10/15/2014 9:36:35 AM] : Detecting dependent assembly System.Data.Entity, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089, processorArchitecture=msil using System.Data.Entity, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089,
    processorArchitecture=msil.
     * [10/15/2014 9:36:35 AM] : Request of trust and detection of platform is complete.
    ERROR DETAILS
     Following errors were detected during this operation.
     * [10/15/2014 9:36:43 AM] System.Deployment.Application.InvalidDeploymentException (HashValidation)
      - File, Microsoft.ReportingServices.ComponentLibrary.Controls.dll, has a different computed hash than specified in manifest.
      - Source: System.Deployment
      - Stack trace:
       at System.Deployment.Application.ComponentVerifier.VerifyFileHash(String filePath, Hash hash)
       at System.Deployment.Application.ComponentVerifier.VerifyFileHash(String filePath, HashCollection hashCollection)
       at System.Deployment.Application.ComponentVerifier.VerifyComponents()
       at System.Deployment.Application.DownloadManager.DownloadDependencies(SubscriptionState subState, AssemblyManifest deployManifest, AssemblyManifest appManifest, Uri sourceUriBase, String targetDirectory, String group, IDownloadNotification
    notification, DownloadOptions options)
       at System.Deployment.Application.ApplicationActivator.DownloadApplication(SubscriptionState subState, ActivationDescription actDesc, Int64 transactionId, TempDirectory& downloadTemp)
       at System.Deployment.Application.ApplicationActivator.InstallApplication(SubscriptionState& subState, ActivationDescription actDesc)
       at System.Deployment.Application.ApplicationActivator.PerformDeploymentActivation(Uri activationUri, Boolean isShortcut, String textualSubId, String deploymentProviderUrlFromExtension, BrowserSettings browserSettings, String& errorPageUrl)
       at System.Deployment.Application.ApplicationActivator.ActivateDeploymentWorker(Object state)
    COMPONENT STORE TRANSACTION DETAILS
     No transaction information is available.

    Hi DMcGarveyt,
    This is a known issue in SSRS 2008 R2 SP3. Microsoft has published an article addressing this issue and is not planning to release a fix for this defect, but there are workarounds. Please refer to the link below:
    Report Builder of SQL Server 2008 R2 Service Pack 3 does not launch.
    If you have any question, please feel free to ask.
    Best Regards,
    Simon Hou

  • Just some thoughts

    Just some thoughts.
    Adobe says that piracy is one of the reasons for going to the cloud.  CC has to verify the subscription periodically, so why not add that process to the perpetual license for a nominal yearly fee?  On the other hand, why can't a user subscribe to CC for a month to get the latest version and then go on the web to get a crack?
    I also own, unless they go to a subscription, Elements 10.  Lots of functions that CS6 has for photo enhancements.  I found some plugins that really enhance Elements. e.g. ElementsXXL adds new menu items, icons, buttons, key shortcuts and dialogs, so they seamlessly integrate into the user interface of Photoshop Elements. ElementsXXL bridges the gap between Photoshop Elements and Photoshop and greatly enhances the image editing experience in Photoshop Elements.
    I am sure other plugins will appear to make Elements more like Photoshop.

    "Adobe says that piracy is one of the reasons for going to the cloud. CC has to verify the subscription periodically why not add that process to the perpetual license for a nominal yearly fee."
    Why on earth would anyone want to pay a "nominal yearly fee" for another nuisance created by Adobe?  They can already shut down perpetual license holders' software if they pull the plug on the activation servers.  Don't you know that their servers check our software now as it is?
    Just recently with the release of the CS2 software to previous owners, many people could no longer use their already-installed CS2 software because Adobe's CS2 servers were shut down.  What does that tell you?
    I'm not going the way of cruising the internet for cracks.  That's for disillusioned kids who think everything should be free.  When my CS6 software no longer works I'm going with another company.  Probably Corel for some of the programs.  CorelDRAW was always better than Illustrator anyway.  Industry standard doesn't always mean it's great.  It just means everyone is using it and they expect you to do as well.  Personally, I think Macromedia made better software before Adobe took them over and MM software was a lot cheaper.

  • Okay, I have read some of the issues here

    Okay, I have read some of the issues here and I am scratching my head. I have my PC hard-wired to a Linksys WRT G router and my Mac is connected wirelessly to it just fine. Now, my computers don't seem to see each other on the network. Pretend I am a complete novice and step me through how to network so I can share the CD files, documents and pics that I have on my PC with my Mac.
    I have sharing enabled on both the Mac and the PC, and I had this working at one point, but something went haywire and all was lost. I have since reset my router and re-established the wireless connection between the PC and Mac.
    When I click on the Finder and click on Network, my PC is no longer in there like it was before.
    Any help would be appreciated.
    Thank,
    Dave

    I'm just getting set up myself. Is your Mac sensing the wireless network? And if so, can you surf and check your email on the Mac wirelessly?

  • Portlet disappeared from Portal Builder after editing

    I created a Master-Detail form in Portal Builder. At the end of one edit, I clicked Cancel to undo the change. Somehow, the portlet totally disappeared. I checked the repository tables WWA_FORM$ and WWA_MODULES$; the form is still there, but it just does not show up in the Portal Builder navigator page. Can anybody help me explain this abnormal behavior and how to get the form back into the Builder?
    Thank you for your help.

    I did more research and found out that the form module that is missing is not in the table WWV_MODULES$. I am wondering why Oracle Portal created such an inconsistency. The module is still in the WWA_MODULES$ table, but not in the WWV_MODULES$ table. How did the record get deleted from WWV_MODULES$, but not WWA_MODULES$? I am the only user of this thing and I never deleted this module. All I have been doing is editing. Does anybody have any ideas?

  • Sorry, we are having some temporary server issues. You can work off line if you plan to insert pictures from your computer

    We have about 200 users that connect to three terminal servers. On the servers we have Office 2013 installed. Several users are stating that when they attempt to search for templates or insert Online Pictures, they are getting error messages of "Sorry,
    we are having some temporary server issues. You can work off line if you plan to insert pictures from your computer"
    Remember, they are connecting to a terminal server, all have roaming profiles, and for some users this works fine.
    We've Googled the error message and found no help. A lot of the responses were to reset the IE settings, which we've tried to no avail.
    Also as a test, we took one of the users that was having the issue and deleted their profile. When they logged back on the server, the same problem occurred again.
    We're currently at a loss as to why it works for some users and not others.
    Any ideas out there?

    You can refer to this link and find the possible solution which is to delete the offending registry key:
    HKCU\Software\Microsoft\Office\15.0\Common\Internet\WebServiceCache\AllUsers\office15client.microsoft.com
    http://angrytechnician.wordpress.com/2013/05/15/office-2013-error-sorry-we-are-having-some-temporary-server-issues/
    There we can also find information about deploying logon script to all roaming profile users to resolve this issue.

  • Please tell me some real-time issues faced by you in SCRIPTS

    Please tell me about some real-time issues you have faced in your experience.

    First, understand that SAP Scripts are client-dependent: a change made in one client is not automatically reflected in the other development clients.
    We mostly see alignment issues in SAP Script, and they are mostly because of printer settings etc.
    Also, we may have a few problems when printing Unicode characters. For printing double-byte characters (like Japanese, Chinese, Korean, etc.) we should have a printer with Unicode enabled.
    Regards,
    SaiRam

  • How do I access the portal "builder"?

    It seems like I am successfully logging in as the orcladmin user via the Login link off of the portal home page (/pls/portal/portal.home), because there is no error upon logging in and I can access the OID admin pages. However, I can't seem to figure out how I get to the pages that will allow me to build portal pages, etc. All I see is the "Home", "Community", "Refresh", "Login" and "Help" buttons on the top navigator. Is there a URL that I can type in to access the portal builder pages?
    I am quite certain that I am logging in properly because after my initial login, when I click the Login link, it does not take me to the login page, just leaves me where I am at.
    Upon my initial login, shouldn't I be forwarded to the portal admin page or something?
    Any help would be appreciated.

    Once you log in, click on the "Corporate Documents" tab.
    A link to the "Builder" should show up in the top right.
    Clicking on that gets you what you want. (There is then a link to the Navigator in the top right of that page, if you want to work on pages.)

  • Hi All, can I have some production support issues with root cause and resolution for SAP TM?

    Hi all, can I have some production support issues with root cause and resolution for SAP TM?
    Thanks,
    Sreenivas

    Hi Sreenivas,
    I would recommend that you read the Rules of Engagement and other documents in the Getting Started link (top right) before posting any more.  Your Discussion will most likely get reported as non-specific and get removed.  If you have a specific problem with TM, please post it in a new thread with error messages, the version and SPs installed, how the error occurs, and what you are trying to get TM to do.
    There are a lot of resources available on the TM Overview page which can help, so start there and maybe also look at some of the MKS (Monday Knowledge Session) recordings which should also be listed.  There are also a lot of experienced people who can help resolve issues with your TM installation, but you need to provide enough information on the problems you are having.  If you are just looking for information on past problems, do a Search or simply browse through past Discussions which are marked with a green check (Correct Answer).
    Regards, Mike
    SAP Customer Experience Group - CEG (and a Moderator)

  • Unable to boot up after Yosemite install. Have to keep shutting down from the rear button. It does come on after a few times but then hangs at user login again. If you get in it is great. Seems faster than Mavericks but there is some sort of issue with bootup/login

    Unable to boot up after the Yosemite install. I have to keep shutting down from the rear button. It does come on after a few times but then hangs at the user login again. If you get in it is great. Seems faster than Mavericks, but there is some sort of issue with bootup/login.

    Knock on wood, this seems to have been my problem as well.  I stumbled on this thread after dealing with these ridiculously long boot times for the past several weeks.
    I just reinstalled McAfee Antivirus and my Macbook Air booted up in less than 1 minute.  No more hanging on the boot-up progress bar.  No more hanging after I click on my user's avatar on the log-in screen.  Bootup would often take 5+ minutes and sometimes never complete.
    This has been SUPREMELY frustrating.
    THANK YOU SO MUCH FOR POSTING YOUR RESPONSE!!!

  • So my AirPort Extreme recently had some NAT/DNS issue, and the AirPort Utility displayed a warning about it and said to correct it. I wasn't sure what to do, so I pressed the resolve icon, and now my guest network is not working.

    So my AirPort Extreme recently had some NAT/DNS issue, and the AirPort Utility displayed a warning about it and said to correct it. I wasn't sure what to do, so I pressed the resolve icon, and now my guest network is not working.

    Anytime you change networking hardware, it is always a good idea to perform a complete power recycle of your networking components.
    I would recommend that you do the following as a minimum:
    Power-down the modem, AirPort base station, and computer(s).
    Disconnect the AirPort base station from the Internet broadband modem.
    While all of the devices are powered-down, perform a "factory default" reset on the base station. This will get it back to its "out-of-the-box" configuration and make setting it up much easier, especially if you use the "Assist me" process within the AirPort Utility. (ref: Resetting an AirPort Base Station or Time Capsule)
    After the base station resets, go ahead and power it back down.
    Reconnect the AirPort base station to the Internet broadband modem. For the Extreme and Time Capsule, be sure to connect the cable to the base station's WAN (circle-of-dots) port.
    Power-up the modem; wait at least 10-15 minutes to allow it adequate time to initialize.
    Power-up the AirPort base station; wait at least 5-10 minutes. Note: The AirPort's status light may continue to flash amber after it has initialized. That is because there may be some additional configuration items necessary, like setting up wireless security, before the overall setup is completed to get a green status.
    Power-up your computer(s).
    In this basic configuration, the AirPort base station will broadcast an unsecured wireless network with a Network Name (SSID) of Apple Network NNNNNN. Network clients, connected to the base station either by wire or wireless, should now be able to access the Internet through the ISP's modem. Once Internet connectivity has been verified, you can use the AirPort Utility to configure the base station for wireless security and any other desired options. Please post back your results.
