HFM Data Extract in Task Flow?

Hi,
I have a couple of clarifications on HFM data extracts in an HFM task flow:
1) Is data extraction possible for predefined accounts in an HFM task flow? That is, can the accounts and other dimension members be pre-defined so that simply executing the task flow extracts the data into a text file (i.e., it should not prompt us to select the accounts and other dimension members each time we extract)?
2) For data extract purposes, we need some of the HFM members under different names, so we planned to create an alias table. During data extraction, is it possible to select the member names from a custom alias table?
3) We have to define a specific set of account values as negative in the extracted data file, even though they are actually positive in HFM. Also, in the extracted data file, we need the account member name to be the same as the existing HFM account name.
a.     For instance,
In HFM:
Account member name in HFM: VUK
Account value: 1000
In the extracted data file:
Account member name in data file: VUK
Account value: -1000
Is this feasible? If yes, can you explain in a bit of detail how we can implement this in HFM?
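One common approach to the sign-flip requirement is a post-processing step over the extract file after the task flow writes it, rather than anything inside HFM itself. A minimal sketch (Python; the semicolon-delimited "Account;Value" layout and the account list are hypothetical assumptions, not the actual HFM extract format):

```python
# Flip the sign of the value for a chosen set of accounts in a
# delimited extract whose records look like "Account;Value".
# NEGATE_ACCOUNTS is a hypothetical placeholder list.
NEGATE_ACCOUNTS = {"VUK"}

def negate_accounts(records):
    """Return the records with values negated for the chosen accounts."""
    out = []
    for rec in records:
        account, value = rec.split(";")
        if account in NEGATE_ACCOUNTS:
            value = str(-float(value))
        out.append(account + ";" + value)
    return out

print(negate_accounts(["VUK;1000", "OTHER;500"]))
# ['VUK;-1000.0', 'OTHER;500']
```

Because the account name in the file is untouched, it stays identical to the HFM member name, as required.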
Thanks,
Siva

Hi,
I need some help with HFM data extraction.
We use four custom dimensions, and our requirement is to extract data for only the custom members we need (we need only parent members in the custom dimensions).
But, by default, HFM extracts data for all base custom members (and not for any parent custom members), which we don't wish to have.
Is there any possibility to extract the data by choosing the custom members for which we need data?
Ours is a classic HFM application on 9.3.1 (with an Oracle DB).
Your response will be highly appreciated!
Thanks,
Siva

Similar Messages

  • HFM data extract for Multiload

    Very new to this and need some help, please. When extracting HFM data for FDM Multiload, what is the best way to get the data for periods into columns and not rows, without formatting the data manually?

    The data extract interface is limited and does not allow you to have multiple columns for the periods. What you can try is to create a Smart View query (if the number of rows is not too large), save the file as a CSV, and then in a text editor replace all commas with semicolons. Otherwise, if you can write Excel VBA code, that could also be an alternative; again, the number of lines could be a potential issue.
    If you don't have to run it through translation tables, you can probably accomplish most of the changes using a strong text editor such as KEdit.
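The comma-to-semicolon step can also be scripted rather than done by hand in a text editor. A rough sketch (Python; the file paths are hypothetical), using the csv module so commas inside quoted values are handled correctly:

```python
import csv

def commas_to_semicolons(in_path, out_path):
    """Rewrite a comma-delimited file as semicolon-delimited."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        writer = csv.writer(dst, delimiter=";")
        for row in csv.reader(src):
            writer.writerow(row)
```

A plain find-and-replace does the same job when no field values contain embedded commas.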

  • Is this possible to display standard JSR-286 data in ADF Task Flow WCP Page

    Below is my scenario,
    I have one Webcenter page. Portlet-A (standard JSR-286 based portlet) consumed from WSRP Registration process and Taskflow-B ( with Fragments ) consumed as region from ADF library jar.
    On submit of a drop-down menu/submit button in Portlet-A, I want to display the selected data in Taskflow-B.
    Please help me out.

    Hi Yannick,
    Thanks a lot for the information. It worked.
    The portlet data can be accessed using bindings, but the parameter name can be different.
    Meanwhile, I have one more scenario, where the portlet and task flow are placed on different pages of the WCP application. On a change of data in the portlet, the application should navigate to another page where the task flow is placed and display the selected data.
    Basically, I cannot use any button for navigation. The navigation should happen once I take some action in the portlet.
    Is this possible? If yes, can you please let me know the steps?
    Thanks in advance!
    Somnath
    Edited by: Somnath Basak on Dec 20, 2011 9:41 AM

  • HFM data extract via Web

    We are using 11.1.2.1 on windows 2008 R2 servers.
    Is there anything in HFM that limits the file size when you're doing a metadata or data export? Maybe not a file size limitation, but perhaps a time limitation or something similar? When I did an export of the application via the HFM client, I got the entire file. When I did it via the web, I could tell I got about half of the file (based on the file size), and I think it gave me the last half of the file, since it didn't start with the correct file tags. I typically don't do any of my extracts via the web (I do them through the client), but many of our users do. If there is some kind of limitation within Hyperion, do you think we could increase it? We have a user who didn't get a complete file via the web.

    Not sure if this is your problem or not, but when we upgraded to 11.1.2.1.103 I had a similar issue where I would only get half of my metadata extract. Maybe try this and see if it helps:
    Metadata/Data Exports
    You will need to set the AspResponseBufferLimit by performing the following steps.
    By default, the value for the ASPBufferLimit property in IIS 6 and for the bufferLimit property in IIS 7 is 4,194,304 bytes (4 MB).
    The recommendation is to set the value to 1073741824 (1 GB).
    To increase the buffering limit in IIS 6, follow these steps:
    1. Click Start, click Run, type cmd, and then click OK.
    2. Type the following command, and then press ENTER:
    cd /d %systemdrive%\inetpub\adminscripts
    3. Type the following command, and then press ENTER:
    cscript.exe adsutil.vbs SET w3svc/aspbufferinglimit LimitSize
    Note: LimitSize represents the buffering limit size in bytes. For example, the number 1073741824 sets the buffering limit size to 1 GB.
    To increase the buffering limit in IIS 7, follow these steps:
    1. Click Start, click Run, type inetmgr, and then click OK.
    2. In the left hand panel, expand the computer name by clicking the plus sign next to it.
    3. Expand the “Sites”.
    4. Expand “Default Web Site”
    5. Select the HFM web site in the expanded list.
    6. In the middle panel “Features View”, in the IIS section, double click the ASP icon.
    7. Expand “Limits Properties”.
    8. Change “Response Buffering Limit” to 1073741824.
    9. In the right hand panel, select “Apply”.
    10. Restart IIS.

  • Custom Dimension sql names in version 11.1.2 HFM Data Extract

    Does anyone know how to reference the custom dimensions in Hyperion 11.1.2? In Hyperion 9.3 they were custom1_item, custom2_item, custom3_item, and custom4_item, but I am not sure what they are now called. Any help would be appreciated.

    I'm not clear on what you're after, but I think your question is about the item tables in HFM's database. If so, the custom dimension table is now appname_custom_item for all of the custom dimensions. Each custom is then distinguished by the LDIMID field, which contains the custom dimension's number ("1", "2", "3", "4"). I've not examined the same for 11.1.2.2 for an application with more than four custom dimensions.
    --Chris

  • Task Flow Return Listener not fire when FK association fields set manually

    Guys and Gals,
    I've spent two solid days on this and I'm not sure why my task flow return listener is not firing.
    I start by selecting a row in a table. I then click a "Convert" button which converts the Quote document into a Sales document. I then press Submit which commits the data and the task flow exits. At this point my task flow return listener should fire. It does not. This return listener would, in theory, refresh the visible Quotes table and update the selected Quote's status to "Closed".
    The Quote's "Closed" status is a transient attribute which is calculated by looking at the Sales' document Qty attribute. If the Quote Qty = Sales Qty, then the status is closed. This can be measured by utilizing an association where
    Sales' BaseRefDocId = Quotes' OrderId
    Sales' BaseRefRowId = Quotes' RowId
    Setting these two row attributes represents the association linking a Quote document row to a Sales document row.
            nvp.setAttribute("BaseRefDocId", baseRow.getAttribute("OrderId"));  // Take the Quote Id and put it in the Sales Id ref field
            nvp.setAttribute("BaseRefRowId", baseRow.getAttribute("Row_Id"));   // Take the Quote Row Id and put it in the Sales Row Id ref field
            targetRow.createAndInitRow(nvp);                                    // Insert the new referenced row into the Sales document
    After two days of running tests, it is the code above that keeps the return listener from firing and the transient attribute from refreshing on the page. These fields are not mandatory, but they are necessary for the Quote's status to change to closed. Simply leaving these lines of code out allows my task flow return listener to refresh correctly, albeit with an incorrect Quote status.
    My expression language statements, however, evaluate correctly regardless of table refresh. If I refresh the table manually, the status then displays the correct value. All other methods of manipulating the table function correctly, i.e., task flow return statements work.
    I'm pretty sure it has something to do with some kind of silent association / view link error blocking the task flow from firing behind the scenes.
    Does anyone have any ideas? Using JDev 11.1.2.1.0.
    Will

    Hi Frank,
    Yeah, I thought it was really weird as well. I banged my head up against the wall again today and finally managed to semi-fix the problem.
    The "Convert" table toolbar button has a "Disabled" attribute that I've been setting with something like #{bindings.QuoteIterator.currentRow == null}. If I take this out, everything works fine. However, if I put it in, the task flow will not return. What's screwy is that I have several of these "Quote" tables for other data collections, such as Sales, Deliveries, Invoices, etc. About half of them fire a task flow return with the "Disabled" attribute set for the convert button, and the other half don't. They all fire a task flow return if I just set "Disabled" to false.
    At three days and counting, this is really an issue I just don't get, and I'm not sure if I could reproduce the problem to submit it to support because everything appears just fine and I've been digging for days.

  • Application Module instance not found in task flow

    Hi,
    I am working with Oracle JDeveloper 11g Release 1.
    I have created a bounded task flow with the following properties:
    <transaction id="__14">
          <new-transaction/>
        </transaction>
        <data-control-scope id="__15">
          <shared/>
        </data-control-scope>
        <task-flow-reentry id="__13">
          <reentry-not-allowed/>
        </task-flow-reentry>
    I have declared a page-flow-scoped bean in the task flow as:
    <managed-bean id="__5">
          <managed-bean-name id="__6">trainBean</managed-bean-name>
          <managed-bean-class id="__8">oracle.sysman.core.gccompliance.view.library.rule.patchRule.PatchRuleTrainBean</managed-bean-class>
          <managed-bean-scope id="__7">pageFlow</managed-bean-scope>
        </managed-bean>
    but when I try to access the AM Impl instance from a bean method as below:
    public static final String DATA_CONTROLLER = "ComplianceLibraryAMDataControl";
        public ComplianceLibraryAMImpl getDataControl() {
            DCBindingContainer bc =
                (DCBindingContainer)BindingContext.getCurrent().getCurrentBindingsEntry();
            ApplicationModule am = bc.findDataControl(DATA_CONTROLLER).getApplicationModule();
            return (ComplianceLibraryAMImpl)am;
        }
    I am getting a NullPointerException on the line ApplicationModule am = bc.findDataControl(DATA_CONTROLLER).getApplicationModule();
    I am using the same approach to get the bean in other task flows as well, so I think the code itself is correct.
    What am I missing here?

    Thanks Puthanampatti,
    This is working and I am able to get the AM instance from it, but I am not sure what the difference is between:
    public ComplianceLibraryAMImpl getComplianceLibraryAM() {
        ComplianceLibraryAMImpl am = (ComplianceLibraryAMImpl)ADFUtils.getApplicationModuleForDataControl("ComplianceLibraryAMDataControl");
        return am;
    }
    Referred methods:
        /**
         * Get application module for an application module data control by name.
         * @param name application module data control name
         * @return ApplicationModule
         */
        public static ApplicationModule getApplicationModuleForDataControl(String name) {
            return (ApplicationModule) JSFUtils.resolveExpression("#{data." +  name +  ".dataProvider}");
        }
        /**
         * Method for taking a reference to a JSF binding expression and returning
         * the matching object (or creating it).
         * @param expression EL expression
         * @return Managed object
         */
        public static Object resolveExpression(String expression) {
            FacesContext facesContext = getFacesContext();
            Application app = facesContext.getApplication();
            ExpressionFactory elFactory = app.getExpressionFactory();
            ELContext elContext = facesContext.getELContext();
            ValueExpression valueExp =
                elFactory.createValueExpression(elContext, expression,
                                                Object.class);
            return valueExp.getValue(elContext);
        }
    And my previous approach:
        public ComplianceLibraryAMImpl getDataControl() {
            DCBindingContainer bc = (DCBindingContainer)BindingContext.getCurrent().getCurrentBindingsEntry();
            ApplicationModule am = bc.findDataControl(DATA_CONTROLLER).getApplicationModule();
            return (ComplianceLibraryAMImpl)am;
        }
    Why am I not able to find my data control AM from the BindingContainer?

  • View object sharing between task flows

    Hi,
    I have two view objects: one should not share data between bounded task flows, and the other should share data between the two task flows.
    The views can also be attached to different application modules. How can I configure the AM properties or task flow properties to achieve this?
    Reg,
    Brahma B.

    Thanks Frank,
    My shared view object may be updatable. In one task flow I will modify one column in the VO, then navigate to the other task flow, where, based on the previously modified column, I will change another column's data and send it to the entity to update the DB. For view updates I don't require any transaction change, such as making it dirty or undirty.
    Do you mean I need to set the AM to application scope or session scope in the project properties? Even when I set the AM to session or application scope, when I run the application for different task flows with Begin New Transaction, my prepareSession method at the AM level is invoked for every task flow, so the transaction is new, correct? So the data is not shared between the task flows even when I give the AM session or application scope.
    Reg,
    Brahma B.
    Edited by: BRM on Nov 28, 2011 3:26 AM

  • Execute a Task flow in HFM based on a event

    Hi All,
    I have a requirement to automate a task flow based on if a file has arrived at a particular location. The requirement goes like this:
    1. File arrives at a particular location
    2. Task flow will be initiated as soon as the file arrives, task flow will load data from the file,consolidate and extract data to a predefined location.
    My question is: how can I write a batch script to check the folder for the file's arrival and, on a positive result, trigger the task flow?
    I can write the batch to check whether the file exists, but I don't know how to trigger the task flow from the batch script.
    Any help, guys? I'm badly stuck.
    Thanks in advance,
    Suman

    Make a file/directory watcher service to monitor for the activity: http://www.techrepublic.com/article/use-the-net-filesystemwatcher-object-to-monitor-directory-changes-in-c/6165137
    Then use hfm-batch to execute your script. https://github.com/cgiogkarakis/hfm-batch
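The two pieces can be glued together with a simple polling loop instead of a full watcher service. A minimal sketch (Python; the folder, file name, and hfm-batch command line are hypothetical placeholders, not the tool's documented interface):

```python
import os
import subprocess
import time

WATCH_DIR = r"C:\hfm\inbound"        # hypothetical drop folder
TRIGGER_FILE = "gl_extract.dat"      # hypothetical expected file
TASKFLOW_CMD = ["hfm-batch.cmd", "MyTaskflow"]  # placeholder invocation

def wait_for_file(directory, filename, poll_seconds=30):
    """Block until the expected file appears, then return its full path."""
    path = os.path.join(directory, filename)
    while not os.path.exists(path):
        time.sleep(poll_seconds)
    return path

if __name__ == "__main__":
    wait_for_file(WATCH_DIR, TRIGGER_FILE)
    subprocess.run(TASKFLOW_CMD, check=True)  # kick off the task flow
```

Polling is cruder than FileSystemWatcher but trivially schedulable as a Windows service or scheduled task.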

  • Analysing Task Audit, Data Audit and Process Flow History

    Hi,
    Internal Audit dept has requested a bunch of information, that we need to compile from Task Audit, Data Audit and Process Flow History logs. We do have all the info available, however not in a format that allows proper "reporting" of log information. What is the best way to handle HFM logs so that we can quickly filter and export required audit information?
    We do have housekeeping in place, so the logs are partial "live" db tables, and partial purged tables that were exported to Excel to archive the historical log info.
    Many Thanks.

    I thought I posted this Friday, but I just noticed I never hit the 'Post Message Button', ha ha.
    The info below will help you translate some of the information in the tables, etc. You could report on it from the audit tables directly or move it to another data table for later analysis. The consensus, though I disagree, is that you will suffer performance issues if your audit tables get too big, so you want to move them periodically. You can do this using a scheduled task, a manual process, etc.
    I personally just dump it to another table and report on it from there. As mentioned above, you'll need to translate some of the information, as it is not 'human readable' in the database.
    For instance, if I wanted to pull Metadata Load, Rules Load, and Member List Load activity, I could run a query like this. (NOTE: strAppName should be equal to the name of your application.)
    The main tricks to know, at least for the task audit table, are figuring out how to convert the times and determining which activity code corresponds to which user-friendly name.
    -- Declare working variables --
    declare @dtStartDate as nvarchar(20)
    declare @dtEndDate as nvarchar(20)
    declare @strAppName as nvarchar(20)
    declare @strSQL as nvarchar(4000)
    -- Initialize working variables --
    set @dtStartDate = '1/1/2012'
    set @dtEndDate = '8/31/2012'
    set @strAppName = 'YourAppNameHere'
    --Get Rules Load, Metadata, Member List
    set @strSQL = '
    select sUserName as "User", ''Rules Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (1)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Metadata Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (21)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Memberlist Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (23)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + ''''
    exec sp_executesql @strSQL
    In regards to activity codes, here's a quick breakdown on those:
    ActivityID     ActivityName
    0     Idle
    1     Rules Load
    2     Rules Scan
    3     Rules Extract
    4     Consolidation
    5     Chart Logic
    6     Translation
    7     Custom Logic
    8     Allocate
    9     Data Load
    10     Data Extract
    11     Data Extract via HAL
    12     Data Entry
    13     Data Retrieval
    14     Data Clear
    15     Data Copy
    16     Journal Entry
    17     Journal Retrieval
    18     Journal Posting
    19     Journal Unposting
    20     Journal Template Entry
    21     Metadata Load
    22     Metadata Extract
    23     Member List Load
    24     Member List Scan
    25     Member List Extract
    26     Security Load
    27     Security Scan
    28     Security Extract
    29     Logon
    30     Logon Failure
    31     Logoff
    32     External
    33     Metadata Scan
    34     Data Scan
    35     Extended Analytics Export
    36     Extended Analytics Schema Delete
    37     Transactions Load
    38     Transactions Extract
    39     Document Attachments
    40     Document Detachments
    41     Create Transactions
    42     Edit Transactions
    43     Delete Transactions
    44     Post Transactions
    45     Unpost Transactions
    46     Delete Invalid Records
    47     Data Audit Purged
    48     Task Audit Purged
    49     Post All Transactions
    50     Unpost All Transactions
    51     Delete All Transactions
    52     Unmatch All Transactions
    53     Auto Match by ID
    54     Auto Match by Account
    55     Intercompany Matching Report by ID
    56     Intercompany Matching Report by Acct
    57     Intercompany Transaction Report
    58     Manual Match
    59     Unmatch Selected
    60     Manage IC Periods
    61     Lock/Unlock IC Entities
    62     Manage IC Reason Codes
    63     Null
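The StartTime-2 adjustment in the query above appears to work because HFM stores audit times as OLE automation dates (fractional days since 30 Dec 1899), while SQL Server's datetime zero point is 1 Jan 1900, two days later. The same conversion can be sketched outside SQL (Python; the sample StartTime value is made up):

```python
from datetime import datetime, timedelta

# HFM task-audit StartTime/EndTime are OLE automation dates:
# fractional days since 30 Dec 1899.
OLE_EPOCH = datetime(1899, 12, 30)

def ole_to_datetime(ole_days):
    """Convert an HFM StartTime/EndTime value to a datetime."""
    return OLE_EPOCH + timedelta(days=ole_days)

# A few of the activity codes used in the query above.
ACTIVITY_NAMES = {1: "Rules Load", 21: "Metadata Load", 23: "Member List Load"}

print(ole_to_datetime(41153.5))   # 2012-09-01 12:00:00
print(ACTIVITY_NAMES[21])         # Metadata Load
```

This is handy when post-processing exported audit rows in a script rather than in the database.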

  • HFM Consolidations using Task Flow Automation

    I have created a task flow that has 5 stages (each stage having a success and a fail link). The first stage is set to consolidate (Impacted) our Actuals scenario for 6 years for 3 entity trees. The 2nd stage consolidates a different scenario for 3 years for 3 trees, etc. The task flow seems to "stop" in Stage 1 after consolidating all 3 trees for a few years. If I look in the View Taskflow Status screen, the task flow still shows as active; however, if I look under Running Tasks, nothing is running. If I look under Task Audit, it shows the last consolidation finished. I run into this same problem both when I schedule the task flow and when I run it manually. It consolidates a few years and then stops (still in the first stage). I looked in the event log on the HFM server and don't see anything suspicious. Has anyone else had this problem?
    Also, one more question: when I go to automate the task flow, why does the server timestamp listed there not match the clock on the server? I'm remoted into the server and the clock on the taskbar says 9am, but the Server Date shown on the Starting Event tab shows 8am.
    Thanks.
    Terri

    I just got back to looking at this today. I ran the task flow to consolidate 2004-2009 Actuals for 3 trees for all 12 months. All 3 trees were consolidated for 2004. Two trees were consolidated for 2005, and then it stopped. Both 2004 and 2005 are in the same stage of the task flow, so if the settings were correct for 2004, they should be fine for 2005. The task audit screen shows that it started consolidating the 2nd tree of 2005 at 8:32am. It then shows my id logging off at 8:33am and the consolidation finishing at 8:38am. Then it shows my id logging off again at 8:47am. The task flow is set up so that my id logs in for the 1st stage and the following stages are set to use Initiator, but the flow isn't getting to the 2nd stage because it dies.
    If I view the status of my task flow, it still shows as active. I don't understand why the task audit shows my id logging off, because I still have the same 2 HFM sessions running as I did when I initiated the task flow; unless it's the session that starts up to run the task flow that's getting logged off, but if it is, I don't know why. Would the auto-logoff after so many minutes of inactivity cause this? Maybe that session appears inactive even though it's waiting for the task flow to do its thing.
    1:07pm follow-up: After the task flow hung this morning, I manually stopped it. I then kicked off the same task flow again and it completed successfully about 4 hours later, so it doesn't seem to be due to the auto-logoff (timeout) setting.
    Terri
    Edited by: Terri T on Apr 20, 2009 10:07 AM

  • Unable to extract all the currencies using Task Flow

    Hi Everyone,
    I have 60 currencies that I extract using a task flow in HFM 11.1.2.1 to a SQL table, but I only receive 59 of those currencies' FX rates in the SQL table. The CNY currency does not extract at all.
    I have tried changing the PoV around with no success. I don't think it is the PoV anyway, because the other 59 currencies have no problem.
    Any inputs please...
    Regards,
    Vinod

    Sounds like this is using an EA extract process. In your Task Flow, can you extract just that one (CNY) currency? If you do this directly in Extended Analytics, is the output any different from the Task Flow? How about if you extract to a text file rather than SQL?

  • How to commit data at the end of a bounded task flow

    Hi all,
    I am using JDev 11.1.1.0.2.
    I have this situation
    1) A page with a button that goes to a task flow to insert data (the data-control-scope property set to shared and the transaction property set to requires-transaction, as suggested in http://www.oracle.com/technology/products/jdev/tips/fnimphius/cancelForm/cancelForm_wsp.html?_template=/ocom/print )
    2) At the end of this task flow I have a Task Flow Return (End Transaction property set to commit)
    When I click the button associated with this Task Flow Return, I return to the first page (described in 1) ), but the data I have just inserted is only submitted, not committed.
    What's the problem?
    Any suggestions?
    Thanks
    Andrea

    Hi,
    if you set the return activity to commit the transaction, then this is done. I don't see why rollback would work but commit doesn't.
    Frank

  • Why : Data Control entry not expandable/empty - ADF Task flow based on HT.

    Hi All,
    I'm hoping to create an ADF task flow based on a human task. I have a standalone ADF application, within the ViewController of which I am attempting to create this task flow. The resultant data control entry doesn't seem to be expandable.
    Questions
    1. When creating an ADF Taskflow based on a human task, should it always be within a project in the same application as the SOA components?
    2. If no, is the resultant human task data control empty or not expandable because the xsd for the HT is based on the MDS? I have configured all the MDS connections within my application.
    JDev : 11.1.1.4
    Thanks
    PP

    Hi,
    Answers :-
    1. No, it is not necessary for your ADF task flow based on a human task to be in the same application where the SOA project resides. But for the deployment of that ADF task flow you will need the SOA project; you can add that project whenever you want to deploy.
    2. It might be caused by the MDS configuration.
    Do one thing: while creating the ADF task flow based on the human task, select the .task file via the file system, not from MDS.
    If you select the .task file of your SOA project, it will ask you to name the task flow; name it as you want and click OK.
    After creating the task flow, it will add a lot of XML files to your project and create a data control for it.
    Hope it helps!!
    Regards,
    Shah

  • Human Task Flow conflict on Data Controls generation

    Hi, I have two distinct human tasks in one process. When I use New -> JSF -> ADF Task Flow from Human Task to create the task flow for the second task in the same public_html/WEB-INF folder as the first one, after the generation the data control for the first human task flow disappears from the Data Controls window.
    I am trying to rebuild my old app as a new app. Although I can create two distinct human task flows in my old app, I have no luck in the new app. Both apps use JDev 11.1.1.5 and are on the same machine. Can anybody give me some hints on the issue?
    Thank you

    Juan C,
    I use JDeveloper 11g Release 1.
    Maybe I didn't explain my question correctly.
    taskdetails1 is created, and in Data Controls I have objects of my BPM Human Task payload.
    But in the source of the file "PackageCreation.task" I can't find any link to the newly created TaskFlow.xml in my UI project.
    So, I have
    a NEW project "PackageCreationUI" with PackageCreation_TaskFlow.xml in it (and the TaskDetails1 file too),
    AND it did NOT change PackageCreation.task in the BPM project.
    If I use the *"BPM form creation wizard"*, after creating the project and the task flow in it, I see changes in PackageCreation.task in the BPM project, something like:
    <taskFlowFileLocation>file:/C:/JDeveloper/mywork/testApp/PackageCreationUI/public_html/WEB-INF/PackageCreation_TaskFlow.xml</taskFlowFileLocation>
