BFC 10 - Cube Designer deployment

Hi All,
We are using BFC 10 and Cube Designer.
There is a requirement to run the cube deployment automatically once a consolidation run has completed on the BFC side, for the same consolidation ID.
Is there a way to achieve this? Basically, the user doesn't want to open Cube Designer (Extended Analytics) and deploy manually; they want to check the reports without that step.
Is there a way out?
Please let me know.

Hello,
Regarding your request: if you have deployed the cube once in Cube Designer and afterwards made modifications to the same consolidation in
Financial Consolidation, it is not necessary to redeploy the cube through Cube Designer. You can simply activate the Live Access option in Cube Designer. This option is used by Cube Designer to schedule the synchronization between the SSAS cube and Financial Consolidation.
You will find below an explanation of the options used by Live Access:
• Query Poll Interval. This option tells the system when to scan all the Business
Objects Financial Consolidation tables in order to detect modifications. If
modifications are detected, the system processes the synchronization.
• Silence Interval and Silence Override Interval. These intervals perform a partial
update. The complete interval to update the cube is Query Poll Interval +
Silence Interval (or Query Poll Interval + Silence Override Interval). For example, with a
Query Poll Interval of 60 seconds and a Silence Interval of 120 seconds, a change is
reflected in the cube roughly 180 seconds after it is made.
• Rebuild Interval. This allows you to specify how often the cube should be completely
processed, regardless of whether the structure has been modified.
Kind Regards,
Donia

Similar Messages

  • Sorting measures (calculated members) in SSAS cube designer

    Hi,
    In the SSAS 2012 cube designer, on the "Calculations" tab, I am writing some simple calculated members and displaying them in a folder (Display Folder = FolderA).
    So this FolderA has the members below:
    [<-A1], [-A1], [A], [A1], [A2], [A3], [A4], [A5], and [>A5]
    While browsing the cube from Excel 2007, FolderA is deployed correctly along with the above measures. All fine till here.
    However, when I expand FolderA, the sorting is not done correctly. It is displayed like below:
    <-A1, >A5, A, A1, -A1, A2, A3, A4, A5
    But I want it sorted correctly so that it looks like below:
    <-A1, -A1, A, A1, A2, A3, A4, A5, >A5
    How do I do this at the cube level? Any property or workaround (using the latest SSAS 2012 version)?
    Thanks

     
    Hi Yvanlathem,
    I am not talking about sorting of folders/subfolders.
    I want to sort the calculated members within the folder.
    As my first post says, FolderA has the members below:
    [<-A1], [-A1], [A], [A1], [A2], [A3], [A4], [A5], and [>A5]
    However, when I browse the cube and expand FolderA, the sorting is not done correctly. Currently it is displayed like below:
    <-A1, >A5, A, A1, -A1, A2, A3, A4, A5
    But I want it sorted correctly as below:
    <-A1, -A1, A, A1, A2, A3, A4, A5, >A5
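    A workaround worth trying: several clients (reportedly including Excel) list calculated members in the order they are declared in the cube's MDX script rather than alphabetically, so reordering the CREATE MEMBER statements in the script view of the Calculations tab may give the desired order. A minimal sketch with placeholder expressions (the NULL bodies are stand-ins for your real calculations); verify the behavior in your client before relying on it:

    ```mdx
    -- Sketch: declare the members in the order you want them displayed.
    -- Expressions are placeholders; replace NULL with your real formulas.
    CREATE MEMBER CURRENTCUBE.[Measures].[<-A1]
        AS NULL, DISPLAY_FOLDER = 'FolderA', VISIBLE = 1;
    CREATE MEMBER CURRENTCUBE.[Measures].[-A1]
        AS NULL, DISPLAY_FOLDER = 'FolderA', VISIBLE = 1;
    CREATE MEMBER CURRENTCUBE.[Measures].[A]
        AS NULL, DISPLAY_FOLDER = 'FolderA', VISIBLE = 1;
    -- ... [A1] through [A5] in order ...
    CREATE MEMBER CURRENTCUBE.[Measures].[>A5]
        AS NULL, DISPLAY_FOLDER = 'FolderA', VISIBLE = 1;
    ```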

  • ROLAP cube aggregation deployment failure

    Hi All
    My ROLAP cube aggregation deployment is failing with following error:
    dbms_odm.createcubeleveltuple ORA-06533: Subscript beyond count
    Has anyone got any idea about this failure please ?

    Hi Bonniejp,
    According to your description, it seems to be a permission issue when deploying the project to Analysis Services. By default, the account used by the AS service has only public permission on a few databases. If you haven't changed the AS service
    account, or haven't granted the permission to this account, then you cannot deploy the AS project to the server.
    So in your scenario, to avoid this issue, you can grant SELECT permission on the data source DB (Adventure Works DW2012) to the default account of the AS service. Or you can change the service account to a domain user who has the corresponding permissions on the
    data source DB.
    If this is not what you want, please post the detailed error information so that we can analyze further.
    Regards,
    Charlie Liao
    TechNet Community Support
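    For reference, granting read access to the SSAS service account as described above could look like the following T-SQL. This is a sketch: the account name is the SQL Server default for the SSAS service and the database name comes from the tutorial; adjust both to your environment.

    ```sql
    -- Sketch: give the default SSAS service account read access to the source DB.
    -- Account and database names are defaults; adjust to your instance.
    CREATE LOGIN [NT Service\MSSQLServerOLAPService] FROM WINDOWS;  -- skip if it already exists
    USE [AdventureWorksDW2012];
    CREATE USER [NT Service\MSSQLServerOLAPService]
        FOR LOGIN [NT Service\MSSQLServerOLAPService];
    ALTER ROLE db_datareader ADD MEMBER [NT Service\MSSQLServerOLAPService];
    ```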

  • Best Practice For Cube Design

    All,
    First post here and was wondering if anyone out there has a best practice for cube design or optimisation. Currently have 7 Cubes that have been populated for the last 6 months and am now looking at ways of speeding up their population.
    Are there any hard and fast rules about dimensions?
    Should they be kept to a percentage of the fact table?
    When should line item dimensions be used?
    Regards
    Gary Boyle

    Hi Gary,
    Ideally the DIM tables should be 20% of the fact table, and preferably less. You can check the size ratios in RSRV using the Database tables test > Database info about InfoProvider tables. Line item dimensions should be employed where the characteristic has a large number of unique values (like 0MATERIAL or 0CUSTOMER), so that another DIM ID is not created and the SID values are used directly in the fact table.
    See these for more:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/media/uuid/10b589ad-0701-0010-0299-e5c282b7aaad
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/08f1b622-0c01-0010-618c-cb41e12c72be
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/6ce7b0a4-0b01-0010-52ac-a6e813c35a84
    Hope this helps...
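    As a rough illustration of the ratio check RSRV performs, it amounts to comparing row counts between a dimension table and the fact table. The table names below follow the usual BW naming pattern (/BIC/D<cube><n> for dimensions, /BIC/F<cube> for the fact table) but are hypothetical; substitute your own cube's tables:

    ```sql
    -- Sketch: dimension-table size as a percentage of the fact table.
    -- "/BIC/DSALES1" and "/BIC/FSALES" are hypothetical example names.
    select
        (select count(*) from "/BIC/DSALES1") * 100.0
        / (select count(*) from "/BIC/FSALES") as dim_pct_of_fact;
    ```

    If the result is well above 20, the characteristic driving that dimension is a candidate for a line item dimension.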

  • Oracle OLAP Desktop-like cube designer application

    I am looking for any kind of desktop cube designer application offered by Oracle. Using drag and drop, the end user can design a cube without any programming knowledge (for example, no XML programming knowledge). I have tried an open source OLAP application called Pentaho Mondrian, which has a GUI-based Java application called Cube Designer. I tried that application to design a basic cube for our business, but the problem is that it requires a lot of XML programming knowledge. Is there such a desktop cube designer available from Oracle?
    Thanks,
    --xinhuan

    Thanks for your information. By looking at Oracle OTN, I found several interesting products: Discoverer, Oracle Warehouse Builder (OWB) and Analytic Workspace Manager (AWM). There is a tutorial on OTN for creating a multi-dimensional cube using AWM. What are the differences among those products? You mentioned OWB is a general-purpose ETL tool with cube-creation capability. Is OWB required for AWM to work? Should the data be stored as a star or snowflake schema in the database using the OWB tool, so that AWM can create a cube based on that star or snowflake schema data? I am confused about the relationship among those tools.
    Thanks,
    --xinhuan

  • Cube Designer needs BiPlatformWSUrl for BOE4.0 on Netweaver 7.3

    Hi,
    BOFC Cube Designer doesn't start. It shows the error message from note https://websmp130.sap-ag.de/sap/support/notes/1622392.
    The resolution described in the note doesn't help me.
    BOE 4.0 is running on a NetWeaver 7.3 Web Application Server. The BiPlatformWSUrl from the note above isn't correct (http://myserver:50000/dswsbobje/services/Session); I've never seen anything similar to this.
    I've also tried the normal URL to the CMC, http://myserver:50000/BOE/CMC/, but this doesn't work either.
    Can anyone help me find the correct BiPlatformWSUrl for Cube Designer, please?
    I would be really thankful.
    Best Regards

    Hi,
    You have posted on the wrong forum; please post it on Business Planning and Consolidations for NW.
    Best regards
    Roberto

  • BFC Extended Analytics or Cube Designer

    Hello
    We're building a business case on the benefits of using Extended Analytics over the reporting capabilities of BFC itself. Does anybody have a high-level document or presentation that outlines the functionalities of the application that they would like to share?
    Thanks beforehand.
    Ashwin

    Hi Deepesh,
    The same applies to star schemas, the main difference being that the star schema itself (the fact and dimension tables) is also stored in the relational DB.
    1. Yes, the member will be present, but I believe it will appear as unselected (unless of course you have selected your members by some filter or characteristic and this new value falls within that selection).
    2. This is a tricky one.
    If you introduce a new dimension (which can only be a user-defined dimension, and therefore you need dimensional analysis), it depends what was done with the 'old' data.
    In Designer, the new dimension will be visible, but if you are using multi measures, you would need to select this new dimension in the "Dimensional Analysis" tab.
    You would now be in a situation where you have a new measure, and as a first step you would need to deploy. Now, if in BFC you introduced this new dimension in (for example) 2014.04, all your data prior to this period would not have any values for this dimensional analysis.
    In the cube this means that if you selected this new measure, you would not see any data for 2014.03, 2014.02, etc.
    This means the end user needs to select both measures in the report, and this can be quite confusing. You can use 'single measure' mode to avoid this.
    3. If you want to use this new characteristic as a hierarchy, you would need to select it in the hierarchy tab and redeploy.
    Thanks
    Marc

  • Purchasing cube designing, update rule mapping for 0calday

    Hi All,
       I am designing a cube for the Purchasing module in BW. I have a doubt about the mapping for 0CALDAY in the update rules: should it be mapped to the document date or the schedule line date? I checked the SAP cube 0PUR_C01, where it is mapped to 0SCHED_DATE (schedule line date). What difference does it make when we choose document date versus schedule line date?
      FYI, we are using only the schedule line data source, which gives the relevant information.
    Thanks,
    Ram

    Hi Ram,
    Please note:
    0SCHED_DATE: the schedule line date is the day on which the scheduled quantity of the material is to be delivered.
    0DOC_DATE: the date on which the document was created.
    Use the first one for purchasing.
    Thanks...
    Shambhu

  • Adding WHERE condition in "Calculations" tab of SSAS Cube Designer

    Hi,
    I am using SSAS 2012 to develop an SSAS cube and need help with the following scenario:
    In the BIDS designer, on the Calculations tab,
    I am creating one calculated member called [% Jobs], for which I am writing the MDX expression below:
    '([Measures].[On Time Jobs]/[Measures].[Total Jobs])*100'
    Till here all is well.
    Now, how do I implement a WHERE condition here?
    For example, in my fact table I have a column Line_Number.
    I want the above MDX expression to consider only Line_Number = 1
    while doing the calculation.
    Hope it is clear.
    Thanks...

    I tried the method below.
    While keeping the expression as it is:
    '([Measures].[On Time Jobs]/[Measures].[Total Jobs])*100'
    in the partition tab I chose the option "Query binding" (instead of "Table binding") and there I added the condition
    WHERE Line_Number = 1
    so that it does the calculation only for the first line.
    Is this approach also correct?
    Thanks
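    An alternative, if Line_Number is exposed as a dimension attribute, is to apply the filter inside the calculated member itself with a tuple, so the partition keeps all rows and other measures stay unfiltered. The attribute name [Fact Line].[Line Number] and its key value below are hypothetical, purely for illustration:

    ```mdx
    -- Sketch: restrict both base measures to Line_Number = 1 via a tuple.
    -- [Fact Line].[Line Number] is a hypothetical attribute name.
    CREATE MEMBER CURRENTCUBE.[Measures].[% Jobs] AS
        ( [Measures].[On Time Jobs], [Fact Line].[Line Number].&[1] )
      / ( [Measures].[Total Jobs],  [Fact Line].[Line Number].&[1] )
      * 100,
    FORMAT_STRING = '0.00', VISIBLE = 1;
    ```

    Note that the query-binding approach filters the whole partition, which changes every measure in it, whereas the tuple only scopes this one calculation.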

  • Is it possible to open a currently deployed cube back in VS?

    *SSAS Version 2008 R2.
    Can I import a cube that has already been deployed back into Visual Studio?
    I'm trying to troubleshoot an issue with one of the calculated measures in a cube, but the developers do not have the solution files available. 
    Do I have the option of extracting/exporting my cube so it can be changed in Visual Studio?

    A little more research would have helped.
    The link below answered my question:
    http://www.sqlservercentral.com/Forums/Topic786526-147-1.aspx
    From the link:
    Open up BIDS (VS) and, instead of connecting to the SSAS DB/cube,
    select New Project. In the project type,
    select 'Import Analysis Services 9.0 Database' (in 2008 it replaces '9.0' with '2008'),
    run through the wizard, which basically requires an AS server and DB, and voilà, you have a project based on the server copy.

  • CUBE - New Deployment Issue - Not working DTMF Relay

    Hello,
    Scheme:
    Cisco SCCP-based IP Phone > CUCM 9.1 w/ SIP Trunk > CUBE (28XX, 151-4.M7) > SIP ITSP
    CUCM Active Call Proc. Node IP: 10.10.10.9
    CUBE Inside Interface IP: 10.10.10.10
    CUBE Outside Interface IP: 20.20.20.20
    Cisco IP Phone: 10.10.10.8
    ITSP SBC IP: 30.30.30.30
    ITSP SIP domain: itsp.domain
    Calling Pty: 9017654321 (translated in CUCM's route pattern which addresses CUBE)
    Called Pty: 9011234567
    While the call was connected, the calling party consecutively dialed 0, 1, 2, 3, 4, but the far-end IVR does not react :(
    Symptom:
    While an outbound call is connected, the calling party (IP phone) dials digits which are not detected by any far-end PSTN (non-corporate) IVR at all.
    Thoughts:
    The ITSP supports only in-band relay (RFC 2833, Named Telephone Events or NTEs).
    Using NTE provides a standard way to transport DTMF tones in RTP packets.
    Thus rtp-nte is configured for both CUCM and ITSP dial-peers on CUBE.
    Initial troubleshooting found that for the active call the inbound CUBE leg shows rtp-nte, but the outbound leg shows inband-voice.
    I have an assumption that the ITSP doesn't give us the 101=rtp-nte payload in the 183 response, but I'm not sure.
    m=audio 10318 RTP/AVP 8
    b=AS:64
    a=rtpmap:8 PCMA/8000
    a=ptime:20
    a=maxptime:20
    Questions:
    1. How can I make CUBE successfully relay DTMF according to the ITSP requirement?
    2. Why doesn't 'show call act/hist voice brief' show a call ID? All my attempts are identified as 2...
    It is hard to differentiate between active and history call records.

     Ayodeji,
    Thanks for your feedback.
    If you look through the already attached output 'show call act voice' of the file 'case-no-dtmf_-_cube-show-20140210-1.txt' you will find the following:
    PeerId=101
    CallOrigin=2
    tx_DtmfRelay=rtp-nte
    CallDuration=00:00:05 sec
    PeerId=201
    CallOrigin=1
    tx_DtmfRelay=inband-voice
    CallDuration=00:00:05 sec
    2    : 230 18:40:38.048 MSK Tue Feb 10 2015.1 +1890 pid:101 Answer 79017654321 active
     dur 00:00:04 tx:234/37440 rx:233/37280
     IP 10.10.10.8:22688 SRTP: off rtt:0ms pl:0/0ms lost:0/0/0 delay:0/0/0ms g711alaw TextRelay: off
     media inactive detected:n media contrl rcvd:n/a timestamp:n/a
     long duration call detected:n long duration call duration:n/a timestamp:n/a
    2    : 231 18:40:38.068 MSK Tue Feb 10 2015.1 +1860 pid:201 Originate 79011234567 active
     dur 00:00:04 tx:233/37280 rx:307/49120
     IP 30.30.30.30:10318 SRTP: off rtt:0ms pl:0/0ms lost:0/0/0 delay:0/0/0ms g711alaw TextRelay: off
     media inactive detected:n media contrl rcvd:n/a timestamp:n/a
     long duration call detected:n long duration call duration:n/a timestamp:n/a
    This means that the target dial-peers (inbound and outbound) are matched as designed.
    dial-peer voice 101 voip
     description -= inbound leg from CUCM to CUBE =-
     session protocol sipv2
     incoming called-number x
     voice-class codec 1  
     voice-class sip bind control source-interface Loopback0
     voice-class sip bind media source-interface Loopback0
     dtmf-relay rtp-nte
     no vad
    dial-peer voice 201 voip
     description -= outbound leg from CUBE to ITSP =-
     translation-profile outgoing cdpn-delete-prefix-00XX7
     max-conn 40
     destination-pattern x
     session protocol sipv2
     session target dns:sbc.itsp.domain
     voice-class codec 1  
     voice-class sip profiles 1
     voice-class sip bind control source-interface Vlan100
     voice-class sip bind media source-interface Vlan100
     dtmf-relay rtp-nte
     no vad
    Now I am arguing with the ITSP to make them send me NTE in their 183/200 responses.
    I've also:
    1. Tried disabling dtmf-relay entirely on the dial-peers (trying inband voice), but this doesn't work and is also not recommended, AFAIK.
    2. Changed the value of SIP Trunk DTMF Signaling Method from 'No Preference' to 'RFC 2833' with a reset applied, as recommended by Suresh. No luck.

  • About Cube design

    Hi,
    I'd like to know if in AWM I can have a single cube with multiple fact tables, where the fact tables share some dimensions while others are specific to a single fact table (or does each fact table need a different cube?), and whether a member of the cube can be mapped by only one fact table and not necessarily by both.
    Thanks
    Giancarlo

    I would be able to answer better if you could describe your facts and elaborate on what you want to achieve. In an earlier post I was trying to say: suppose you have a fact with dim1, dim2, dim3 and measure1, and you create one cube for it, say cube1.
    Now do you mean that you have another fact with dimensionality dim4, dim5, dim6 and measure2? If yes, then you should create a second cube with dimensionality dim4, dim5, dim6 and measure2.
    On the other hand, if the second fact has dim1, dim2, dim3 with measure2, then you can have a single cube with two measures, and the source view to load the cube could look like this:
    create view cube_source as
    select dim1, dim2, dim3, sum(measure1) as measure1, sum(measure2) as measure2
    from (
      select dim1, dim2, dim3, measure1, null as measure2 from fact1
      union all
      select dim1, dim2, dim3, null as measure1, measure2 from fact2
    ) t
    group by dim1, dim2, dim3;
    Above is my perception, but I am not sure I really understood your question correctly. Can you elaborate on what you are trying to achieve?
    Thanks,
    Brijesh

  • Error in deploying an OLAP cube with OWB ---- Oracle ACE's , please help me

    I am using OWB 10.2 to build a BI application. All goes well until I try to deploy an OLAP relational cube. When I configure an OLAP cube (deployment options) with the 'deploy all' value, i.e. OWB generates a PL/SQL script with CMW2 procedures, Control Center Manager can't deploy my cube and gives me this log:
    ORA-06510: PL/SQL: unhandled user-defined exception
    ORA-06512: at "OLAPSYS.CMW2_OLAP_UTILITY", line 1660
    ORA-01403: no data found
    ORA-06512: at "OLAPSYS.CMW2_OLAP_CUBE", line 33
    ORA-06512: at "OLAPSYS.CMW2_OLAP_CUBE", line 55
    ORA-06512: at "OLAPSYS.CMW2_OLAP_CUBE", line 386
    ORA-06512: at line 5
    What do you think is wrong?
    TIA

    Ragnar is quite correct. When you design a cube there is an associated table generated and FK constraints are created that link back to the related dimension objects and their associated tables. The OLAP dimensions and cubes are just pieces of metadata and don't actually exist as first class objects in their own right. Most importantly don't confuse OLAP dimensions with relational dimensions that are created to support query re-write.
    In Control Center, if you check the Table node in your tree, you should see a list of tables that are the relational source objects for your dimensions and cubes. Deploy the dimension tables first, then deploy the dimensions using the "Deploy All" option. Lastly, deploy your table followed by the cube.
    In Control Center it is possible to force OWB to deploy all related objects. I think it is on one of the menu options within Design Center. I will see if I can find the exact details when I next have OWB open.
    Hope this helps
    Keith Laker
    Oracle EMEA Consulting
    BI Blog: http://oraclebi.blogspot.com/
    DM Blog: http://oracledmt.blogspot.com/
    BI on Oracle: http://www.oracle.com/bi/
    BI on OTN: http://www.oracle.com/technology/products/bi/
    BI Samples: http://www.oracle.com/technology/products/bi/samples/

  • BOFC / EA: Deployment to MS Cubes, parallel creation universe fails

    Dear SAP BOFC Experts,
    we face a problem during the deployment of BOFC data to MS cubes. We would like to create a BOBJ universe with every deployment, so we did the following:
    - created a target folder in the CMC
    - in EA Designer, activated the checkbox "Automatically create associated universe" under the options of the deployment
    - entered this folder (/webi universes/EA Universes) and the name (EA_Unv_1) of the universe
    - saved the deployment and started the deployment
    - the deployment works fine, but the universe creation fails with the message:
    "'EA_Unv_1' universe creation failed, CMSName is empty"
    - then we found out that the universe designer has to be installed on the server the deployment service runs on. It wasn't, so we installed it and registered it with designer.exe /regserver, which worked fine. Then we rebooted the deployment server.
    - we tried to deploy again: the deployment is OK, but the universe creation fails with the same error message.
    Any help or idea is highly appreciated!
    Thanks in advance & kind regards,
    Bastian

    Hi Bastian,
    I wonder if you have resolved the problem with the universe not being created.
    I have a problem with Cube Designer creating the universe. The messages during cube deployment show that the cube is processed, the connection is opened, and universe creation starts. Then a message appears saying that the universe creation failed.
    We have BOE installed on a different server from the one running SSAS and Cube Designer.
    Regards,
    Minka Tinkova

  • SQL SERVER 2012 cube deployment error

    I am a first-time SQL Server user trying to do the Analysis Services tutorials. I am on the last part of Lesson 2, where the cube is deployed. Here is my error message. Note: I did have permission issues originally when trying to upload AdventureWorksDW2012.
    Can anyone help me here?
    Warning 1: Dimension [Date]: Create hierarchies in non-parent child dimensions.
    Warning 2: Dimension [Customer]: Create hierarchies in non-parent child dimensions.
    Warning 3: Dimension [Product]: Create hierarchies in non-parent child dimensions.
    Error 4: Internal error: The operation terminated unsuccessfully.
    Error 5: Server: The current operation was cancelled because another operation in the transaction failed.
    Error 6: OLE DB error: OLE DB or ODBC error: Login failed for user 'NT Service\MSSQLServerOLAPService'.; 28000.
    Error 7: Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of 'Adventure Works DW2012', Name of 'Adventure Works DW2012'.
    Error 8: Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Customer', Name of 'Customer' was being processed.
    Error 9: Errors in the OLAP storage engine: An error occurred while the 'Geography Key' attribute of the 'Customer' dimension from the 'Analysis Services t' database was being processed.
    Error 10: OLE DB error: OLE DB or ODBC error: Login failed for user 'NT Service\MSSQLServerOLAPService'.; 28000.
    Error 11: Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of 'Adventure Works DW2012', Name of 'Adventure Works DW2012'.
    Error 12: Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Customer', Name of 'Customer' was being processed.
    Error 13: Errors in the OLAP storage engine: An error occurred while the 'Birth Date' attribute of the 'Customer' dimension from the 'Analysis Services t' database was being processed.

    So I went into SQL Server Configuration Manager, right-clicked on SQL Server Analysis Services and chose the built-in (On network) account, but now I have even more errors. 57 this time.
    Warning 1: Dimension [Date]: Create hierarchies in non-parent child dimensions.
    Warning 2: Dimension [Customer]: Create hierarchies in non-parent child dimensions.
    Warning 3: Dimension [Product]: Create hierarchies in non-parent child dimensions.
    Error 4: Internal error: The operation terminated unsuccessfully.
    Error 5: OLE DB error: OLE DB or ODBC error: Login failed for user 'WORKGROUP\COOK-PC$'.; 28000.
    Error 6: Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of 'Adventure Works DW2012', Name of 'Adventure Works DW2012'.
    Error 7: Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Customer', Name of 'Customer' was being processed.
    Error 8: Errors in the OLAP storage engine: An error occurred while the 'Total Children' attribute of the 'Customer' dimension from the 'Analysis Services t' database was being processed.
    Error 9: Server: The current operation was cancelled because another operation in the transaction failed.
    Error 10: OLE DB error: OLE DB or ODBC error: Login failed for user 'WORKGROUP\COOK-PC$'.; 28000.
    Error 11: Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of 'Adventure Works DW2012', Name of 'Adventure Works DW2012'.
    Error 12: Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Customer', Name of 'Customer' was being processed.
    Error 13: Errors in the OLAP storage engine: An error occurred while the 'Marital Status' attribute of the 'Customer' dimension from the 'Analysis Services t' database was being processed.
    Error 14: OLE DB error: OLE DB or ODBC error: Login failed for user 'WORKGROUP\COOK-PC$'.; 28000.
    Error 15: Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of 'Adventure Works DW2012', Name of 'Adventure Works DW2012'.
    Error 16: Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Customer', Name of 'Customer' was being processed.
    Error 17: Errors in the OLAP storage engine: An error occurred while the 'English Education' attribute of the 'Customer' dimension from the 'Analysis Services t' database was being processed.
