Analysis and optimization

Hello everyone,
I have been working on this subject for three weeks now and I need help.
I am a trainee in a company where I have to analyze and optimize their GPOs, as simple as that, so I have learned in detail how this tool works, along with other useful things about Active Directory.
I went through their 60 GPOs (some with up to 600 settings...) and their thousands of parameters, which was essential. During my research I found many, many tools for detecting parameter conflicts or duplicated settings, but after all my attempts I am still not satisfied with what I found.
I used a trial version of GPOAdmin, the GPO Reporting Pack from SDM, probably all the Microsoft tools, ActiveAdministrator, etc. All of these tools are very powerful and offer many features, but I just need something that will find and show me all the conflicts in my domain, so that I can correct those settings, end up with a fully optimized domain, and stop the complaints from users because logons will be faster.
Maybe I am not using these products as I should, or maybe such a tool does not exist, but it seems very tedious to analyze everything by myself, write down every parameter on each object it applies to, and check whether there is a conflict or another GPO setting the same value.
Maybe PowerShell can help me with this, but I don't know how to use it for that.
So here I am: if you have any ideas about best practice, or if someone has had to do the same job as mine, please tell me. I will be very happy to receive your information.
Thanks, and sorry for my English.
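Since the post mentions PowerShell: below is a rough sketch of the kind of comparison being asked for, built on the Get-GPO and Get-GPOReport cmdlets from the GroupPolicy module. The XML element names it looks for (Policy, Name, State) are an assumption about the Administrative Templates section of the report and may need adjusting; treat it as a starting point, not a finished audit tool.

Import-Module GroupPolicy

# Collect every Administrative Templates setting from every GPO in the domain
$settings = foreach ($gpo in Get-GPO -All) {
    [xml]$report = Get-GPOReport -Guid $gpo.Id -ReportType Xml
    foreach ($policy in $report.SelectNodes("//*[local-name()='Policy']")) {
        $name  = $policy.SelectSingleNode("*[local-name()='Name']")
        $state = $policy.SelectSingleNode("*[local-name()='State']")
        if ($name) {
            [pscustomobject]@{
                Gpo     = $gpo.DisplayName
                Setting = $name.InnerText
                State   = if ($state) { $state.InnerText } else { '' }
            }
        }
    }
}

# A setting defined in more than one GPO is either a duplicate (same state)
# or a potential conflict (different states)
$settings |
    Group-Object Setting |
    Where-Object { $_.Count -gt 1 } |
    Sort-Object Name |
    ForEach-Object {
        '{0} -> {1}' -f $_.Name, (($_.Group | ForEach-Object { "$($_.Gpo) [$($_.State)]" }) -join '; ')
    }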

> I mean that if there are 10 GPOs for the domain and 10 others on child
> OUs, some parameters will be overwritten (a conflict) or the same
> parameter will be set 5 times (duplication).
Yes, that's true. But setting a single registry key takes so little time that
Windows can barely even resolve it in the gpsvc.log timestamps. This is from a
VM running on a desktop system concurrently with 4 other VMs:
GPSVC(478.d68) 11:48:19:813 SetRegistryValue: 1 => Microsoft.CredentialManager  [OK]
GPSVC(478.d68) 11:48:19:813 SetRegistryValue: 2 => Microsoft.GetPrograms  [OK]
GPSVC(478.d68) 11:48:19:813 SetRegistryValue: 3 => Microsoft.HomeGroup  [OK]
GPSVC(478.d68) 11:48:19:813 SetRegistryValue: 4 => Microsoft.iSCSIInitiator  [OK]
GPSVC(478.d68) 11:48:19:813 SetRegistryValue: 5 => Microsoft.ParentalControls  [OK]
GPSVC(478.d68) 11:48:19:813 SetRegistryValue: 6 => Microsoft.PeopleNearMe  [OK]
GPSVC(478.d68) 11:48:19:813 SetRegistryValue: 7 => Microsoft.UserAccounts  [OK]
GPSVC(478.d68) 11:48:19:829 SetRegistryValue: 8 => Microsoft.WindowsAnytimeUpgrade  [OK]
And even here it takes only about 1 ms average - on a real system, this
is about 50 times faster.
Martin
How about reading a GOOD book about GPOs for a change?
NO, THEY ARE NOT EVIL, if you know what you are doing:
Good or bad GPOs?
And if IT bothers me: refreshment in coke-bottle design :))

Similar Messages

  • SQL Tuning and OPTIMIZER - Execution Time with "AND col <> .."

    Hi all,
    I have a question about SQL tuning and the optimizer.
    There are three samples with EXPLAIN PLAN and execution time.
    This "tw_pkg.getMaxAktion" is a PL/SQL package function.
    1.) Execution Time : 0.25 Second
    2.) Execution Time : 0.59 Second
    3.) Execution Time : 1.11 Second
    The only difference is some additional "AND col <> .." predicates.
    Why does the execution time grow so much?
    Many Thanks,
    Thomas
    ----[First example]---
    Connected to Oracle Database 10g Enterprise Edition Release 10.2.0.3.0
    Connected as dbadmin2
    SQL>
    SQL> EXPLAIN PLAN FOR
      2  SELECT * FROM ( SELECT studie_id, tw_pkg.getMaxAktion(studie_id) AS max_aktion_id
      3                    FROM studie
      4                 ) max_aktion
      5  WHERE max_aktion.max_aktion_id < 900 ;
    Explained
    SQL> SELECT * FROM TABLE(dbms_xplan.display);
    PLAN_TABLE_OUTPUT
    Plan hash value: 3201460684
    | Id  | Operation            | Name        | Rows  | Bytes | Cost (%CPU)| Time
    |   0 | SELECT STATEMENT     |             |   220 |   880 |     5  (40)| 00:00:
    |*  1 |  INDEX FAST FULL SCAN| SYS_C005393 |   220 |   880 |     5  (40)| 00:00:
    Predicate Information (identified by operation id):
       1 - filter("TW_PKG"."GETMAXAKTION"("STUDIE_ID")<900)
    13 rows selected
    SQL>
    Execution time (PL/SQL Developer says): 0.25 seconds
    ----[/First]---
    ----[Second example]---
    Connected to Oracle Database 10g Enterprise Edition Release 10.2.0.3.0
    Connected as dbadmin2
    SQL>
    SQL> EXPLAIN PLAN FOR
      2  SELECT * FROM ( SELECT studie_id, tw_pkg.getMaxAktion(studie_id) AS max_aktion_id
      3                    FROM studie
      4                 ) max_aktion
      5  WHERE max_aktion.max_aktion_id < 900
      6    AND max_aktion.max_aktion_id <> 692;
    Explained
    SQL> SELECT * FROM TABLE(dbms_xplan.display);
    PLAN_TABLE_OUTPUT
    Plan hash value: 3201460684
    | Id  | Operation            | Name        | Rows  | Bytes | Cost (%CPU)| Time
    |   0 | SELECT STATEMENT     |             |    11 |    44 |     6  (50)| 00:00:
    |*  1 |  INDEX FAST FULL SCAN| SYS_C005393 |    11 |    44 |     6  (50)| 00:00:
    Predicate Information (identified by operation id):
       1 - filter("TW_PKG"."GETMAXAKTION"("STUDIE_ID")<900 AND
                  "TW_PKG"."GETMAXAKTION"("STUDIE_ID")<>692)
    14 rows selected
    SQL>
    Execution time (PL/SQL Developer says): 0.59 seconds
    ----[/Second]---
    ----[Third example]---
    SQL> EXPLAIN PLAN FOR
      2  SELECT * FROM ( SELECT studie_id, tw_pkg.getMaxAktion(studie_id) AS max_aktion_id
      3                    FROM studie
      4                 ) max_aktion
      5  WHERE max_aktion.max_aktion_id < 900
      6    AND max_aktion.max_aktion_id <> 692
      7    AND max_aktion.max_aktion_id <> 392;
    Explained
    SQL> SELECT * FROM TABLE(dbms_xplan.display);
    PLAN_TABLE_OUTPUT
    Plan hash value: 3201460684
    | Id  | Operation            | Name        | Rows  | Bytes | Cost (%CPU)| Time
    |   0 | SELECT STATEMENT     |             |     1 |     4 |     6  (50)| 00:00:
    |*  1 |  INDEX FAST FULL SCAN| SYS_C005393 |     1 |     4 |     6  (50)| 00:00:
    Predicate Information (identified by operation id):
       1 - filter("TW_PKG"."GETMAXAKTION"("STUDIE_ID")<900 AND
                  "TW_PKG"."GETMAXAKTION"("STUDIE_ID")<>692 AND
                  "TW_PKG"."GETMAXAKTION"("STUDIE_ID")<>392)
    15 rows selected
    SQL>
    Execution time (PL/SQL Developer says): 1.11 seconds
    ----[/Third]---

    Hi,
    this is likely because SQL Developer fetches and displays only a limited number of rows from the query results.
    This number is a parameter called 'SQL array fetch size'; you can find it in the SQL Developer preferences under Tools/Preferences/Database/Advanced, and its default value is 50 rows.
    The query scans the table from the beginning and continues scanning until the first 50 rows are selected.
    If the query conditions are more selective, then more table rows (or index entries) must be scanned to fetch the first 50 results, and the execution time grows.
    This effect is usually unnoticeable when the query uses simple, fast built-in comparison operators (like = and <>) or Oracle built-in functions, but your query uses a PL/SQL function, which is much slower than built-in functions/operators.
    Try changing this parameter to 1000 and most likely you will see that the execution times of all three queries become similar.
    Look at this simple test to figure out how it works:
    CREATE TABLE studie AS
    SELECT row_number() OVER (ORDER BY object_id) studie_id,  o.*
    FROM (
      SELECT * FROM all_objects
      CROSS JOIN
      (SELECT 1 FROM dual CONNECT BY LEVEL <= 100)
    ) o;
    CREATE INDEX studie_ix ON studie(object_name, studie_id);
    ANALYZE TABLE studie COMPUTE STATISTICS;
    CREATE OR REPLACE FUNCTION very_slow_function(action IN NUMBER)
    RETURN NUMBER
    IS
    BEGIN
      -- trivial body; the per-row SQL-to-PL/SQL call overhead is what makes it "slow"
      RETURN action;
    END;
    /
    The 'SQL array fetch size' parameter in SQL Developer has been set to 50 (the default). We will run 3 different queries on the test table.
    Query 1:
    SELECT * FROM ( SELECT studie_id, very_slow_function(studie_id) AS max_aktion_id
                         FROM studie
                  ) max_aktion
    WHERE max_aktion.max_aktion_id < 900
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.00          0          0          0           0
    Execute      1      0.00       0.00          0          0          0           0
    Fetch        1      1.22       1.29          0       1310          0          50
    total        3      1.22       1.29          0       1310          0          50
    Misses in library cache during parse: 1
    Optimizer mode: ALL_ROWS
    Parsing user id: 93  (TEST)
    Rows     Row Source Operation
         50  INDEX FAST FULL SCAN STUDIE_IX (cr=1310 pr=0 pw=0 time=355838 us cost=5536 size=827075 card=165415)(object id 79865)
    Rows     Execution Plan
          0  SELECT STATEMENT   MODE: ALL_ROWS
         50   INDEX   MODE: ANALYZED (FAST FULL SCAN) OF 'STUDIE_IX' (INDEX)
    Query 2:
    SELECT * FROM ( SELECT studie_id, very_slow_function(studie_id) AS max_aktion_id
                         FROM studie
                  ) max_aktion
    WHERE max_aktion.max_aktion_id < 900
          AND max_aktion.max_aktion_id > 800
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.01          0          0          0           0
    Execute      1      0.00       0.00          0          0          0           0
    Fetch        1      8.40       8.62          0       9351          0          50
    total        3      8.40       8.64          0       9351          0          50
    Misses in library cache during parse: 1
    Optimizer mode: ALL_ROWS
    Parsing user id: 93  (TEST)
    Rows     Row Source Operation
         50  INDEX FAST FULL SCAN STUDIE_IX (cr=9351 pr=0 pw=0 time=16988202 us cost=5552 size=41355 card=8271)(object id 79865)
    Rows     Execution Plan
          0  SELECT STATEMENT   MODE: ALL_ROWS
         50   INDEX   MODE: ANALYZED (FAST FULL SCAN) OF 'STUDIE_IX' (INDEX)
    Query 3:
    SELECT * FROM ( SELECT studie_id, very_slow_function(studie_id) AS max_aktion_id
                         FROM studie
                  ) max_aktion
    WHERE max_aktion.max_aktion_id = 600
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.01       0.00          0          0          0           0
    Execute      1      0.00       0.00          0          0          0           0
    Fetch        1     18.72      19.16          0      19315          0           1
    total        3     18.73      19.16          0      19315          0           1
    Misses in library cache during parse: 1
    Optimizer mode: ALL_ROWS
    Parsing user id: 93  (TEST)
    Rows     Row Source Operation
          1  INDEX FAST FULL SCAN STUDIE_IX (cr=19315 pr=0 pw=0 time=0 us cost=5536 size=165415 card=33083)(object id 79865)
    Rows     Execution Plan
          0  SELECT STATEMENT   MODE: ALL_ROWS
          1   INDEX   MODE: ANALYZED (FAST FULL SCAN) OF 'STUDIE_IX' (INDEX)
    Query 1 - 1.29 sec, 50 rows fetched, 1310 index entries scanned to find these 50 rows.
    Query 2 - 8.64 sec, 50 rows fetched, 9351 index entries scanned to find these 50 rows.
    Query 3 - 19.16 sec, only 1 row fetched, 19315 index entries scanned (the full index).
    Now 'SQL array fetch size' parameter in SQLDeveloper has been set to 1000.
    Query 1:
    SELECT * FROM ( SELECT studie_id, very_slow_function(studie_id) AS max_aktion_id
                         FROM studie
                  ) max_aktion
    WHERE max_aktion.max_aktion_id < 900
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.00          0          0          0           0
    Execute      1      0.00       0.00          0          0          0           0
    Fetch        1     18.35      18.46          0      19315          0         899
    total        3     18.35      18.46          0      19315          0         899
    Misses in library cache during parse: 0
    Optimizer mode: ALL_ROWS
    Parsing user id: 93  (TEST)
    Rows     Row Source Operation
        899  INDEX FAST FULL SCAN STUDIE_IX (cr=19315 pr=0 pw=0 time=20571272 us cost=5536 size=827075 card=165415)(object id 79865)
    Rows     Execution Plan
          0  SELECT STATEMENT   MODE: ALL_ROWS
        899   INDEX   MODE: ANALYZED (FAST FULL SCAN) OF 'STUDIE_IX' (INDEX)
    Query 2:
    SELECT * FROM ( SELECT studie_id, very_slow_function(studie_id) AS max_aktion_id
                         FROM studie
                  ) max_aktion
    WHERE max_aktion.max_aktion_id < 900
          AND max_aktion.max_aktion_id > 800
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.00          0          0          0           0
    Execute      1      0.00       0.00          0          0          0           0
    Fetch        1     18.79      18.86          0      19315          0          99
    total        3     18.79      18.86          0      19315          0          99
    Misses in library cache during parse: 0
    Optimizer mode: ALL_ROWS
    Parsing user id: 93  (TEST)
    Rows     Row Source Operation
         99  INDEX FAST FULL SCAN STUDIE_IX (cr=19315 pr=0 pw=0 time=32805696 us cost=5552 size=41355 card=8271)(object id 79865)
    Rows     Execution Plan
          0  SELECT STATEMENT   MODE: ALL_ROWS
         99   INDEX   MODE: ANALYZED (FAST FULL SCAN) OF 'STUDIE_IX' (INDEX)
    Query 3:
    SELECT * FROM ( SELECT studie_id, very_slow_function(studie_id) AS max_aktion_id
                         FROM studie
                  ) max_aktion
    WHERE max_aktion.max_aktion_id = 600
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.00          0          0          0           0
    Execute      1      0.00       0.00          0          0          0           0
    Fetch        1     18.69      18.84          0      19315          0           1
    total        3     18.69      18.84          0      19315          0           1
    Misses in library cache during parse: 0
    Optimizer mode: ALL_ROWS
    Parsing user id: 93  (TEST)
    Rows     Row Source Operation
          1  INDEX FAST FULL SCAN STUDIE_IX (cr=19315 pr=0 pw=0 time=0 us cost=5536 size=165415 card=33083)(object id 79865)
    Rows     Execution Plan
          0  SELECT STATEMENT   MODE: ALL_ROWS
          1   INDEX   MODE: ANALYZED (FAST FULL SCAN) OF 'STUDIE_IX' (INDEX)
    And now:
    Query 1 - 18.46 sec, 899 rows fetched, 19315 index entries scanned.
    Query 2 - 18.86 sec, 99 rows fetched, 19315 index entries scanned.
    Query 3 - 18.84 sec, 1 row fetched, 19315 index entries scanned.

  • Performance analysis and optimization tools

    Hello, I am looking for tools for performance analysis and optimization for Oracle. So far I have looked at Spotlight, Ignite, and Embarcadero DB Optimizer. Can you please point me to some links or other material comparing such tools?
    What tools do you use?
    Thanks,

    For performance analysis you can use AWR and ASH.
    -- How to analyze AWR/Statspack reports:
    http://jonathanlewis.wordpress.com/statspack-examples/
    -- How to generate AWR and ASH reports:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14211/autostat.htm#PFGRF02601
    http://www.oracle.com/technology/pub/articles/10gdba/week6_10gdba.html
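    As a quick, hedged illustration of taking an AWR report from SQL*Plus (AWR requires the Diagnostics Pack license and a user with EXECUTE on DBMS_WORKLOAD_REPOSITORY):

    -- take a snapshot before and after the workload you want to analyze
    EXEC DBMS_WORKLOAD_REPOSITORY.CREATE_SNAPSHOT;
    -- ... run the workload ...
    EXEC DBMS_WORKLOAD_REPOSITORY.CREATE_SNAPSHOT;
    -- generate the AWR report (the script prompts for the begin/end snapshot IDs)
    @?/rdbms/admin/awrrpt.sql
    -- ASH report, for a shorter and more fine-grained time window
    @?/rdbms/admin/ashrpt.sql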
    Regards
    Asif Kabir

  • Difference between Bex Analyzer and Bex WAD

    Hi everyone,
    I am pretty new to SAP, and I am working on the SAP BW front end.
    I am familiar with BEx Analyzer and also the Browser, but I have always had a doubt about BEx WAD.
    In my present job I talk to the users, take the requirements, and create queries in BEx Analyzer, which I am comfortable doing.
    I also know the features of the Analyzer: I have created queries, published them on the web, created workbooks, inserted multiple queries into a workbook, and so on.
    However, I have zero knowledge of WAD and how it is useful to me, and so far I have not received any requirement from the client to create a report in WAD or to do anything in WAD.
    I have gone through the SAP SDN help etc., but I am not understanding it at all.
    So can anyone explain to me, in layman's terms, what WAD is and how it will be useful to me?
    Also, if possible, send me a link with PDF screenshots of what WAD is.
    Points for sure.
    Regards,
    Ram

    Raj,
    Thanks for the reply, but I still have not gotten an understanding of WAD. I have studied the link you sent me.
    I am still not able to understand how it is helpful for people like me.
    I create my report in Query Designer and I can execute the query in the Analyzer. If I want, I can execute it on the web from Query Designer, create a chart on the web, and broadcast it on the web.
    If I want, I can also create it in a workbook and create a chart in that workbook.
    So what is the point of Web Application Designer (WAD)?
    Say I have created a query in the Designer in development, a sales report query on the InfoProvider for SD. Now what should I do with it in WAD?
    How can I get it into WAD? I went to Start > All Programs > BEx > WAD, logged in to development, and a screen opened. From here, what can I do? I mean, how can I open my created query in WAD?
    As you know, I am new to SAP itself, but I do not see any advantage to WAD. I read all the articles and found that I can create web items, HTML, etc., but I already have HTML: when I execute my query from the Designer I get it in a web browser...
    My question is simple: I have a query which I created in the Query Designer. How can I get it into WAD, and after getting it there, what can I do with it?
    If anyone has screenshots of this, please send them to me.
    Sanjeev sent me a PDF which I saw earlier, and I did the same in my WAD, but I am not able to get it to work.
    I have given points to everyone, but I still have not gotten what I need.
    Regards,
    Ram

  • Differentiate between mapping and optimization.

    Hi,
    tell me something about this:
    Differentiate between mapping and optimization.
    Please,
    urgent. Imran

    user571615 wrote:
    Hi
    tell me something about this.
    Differentiate between mapping and optimization.
    please
    urgent. imran
    This is a forum of volunteers. There is no "urgent" here. For urgent matters, buy yourself a support contract and open an SR on MetaLink.

  • How to capture a .gif file from a spectrum analyzer and save the file in PC

    I want to capture a .gif file from a spectrum analyzer and save it on the PC, but I have a problem when reading data from the instrument. I am not sure how to format the string read back from the instrument: when I use "%s" or "%t" as the read format, the data from the instrument is truncated.
    My code is as follows; could anyone tell me where I am going wrong?
    char resultsArray[5000];
    viPrintf(hSpectrumInstr, ":MMEMTORCR 'CICTURE.GIF'\n");
     viQueryf(hSpectrumInstr, ":MMEMATA? 'CICTURE.GIF'\n", "%t", resultsArray);
     printf("%s", resultsArray);
     getchar();       

    /* Use the "%b" (binary block) format instead of "%t" so the response is read
       as binary data and is not truncated at the first termination character: */
    char resultsArray[5000];
    viPrintf(hSpectrumInstr, ":MMEMTORCR 'CICTURE.GIF'\n");
    viQueryf(hSpectrumInstr, ":MMEMATA? 'CICTURE.GIF'\n", "%b", resultsArray);
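    To actually save the captured image on the PC, something along these lines could work. This is only a hedged sketch: the command strings are copied from the post above, while the buffer size, output file name and byte-count handling are assumptions that may need adjusting for your instrument and VISA library.

    #include <stdio.h>
    #include <visa.h>

    static void save_screen_dump(ViSession hSpectrumInstr)
    {
        static unsigned char buffer[65536];  /* assumed large enough for one screen dump */
        ViUInt32 count = 0;

        /* trigger the screen dump on the instrument (commands copied from the post above) */
        viPrintf(hSpectrumInstr, ":MMEMTORCR 'CICTURE.GIF'\n");

        /* request the file contents, then read the raw response bytes; viRead also
           reports how many bytes were actually received */
        viPrintf(hSpectrumInstr, ":MMEMATA? 'CICTURE.GIF'\n");
        viRead(hSpectrumInstr, buffer, sizeof buffer, &count);

        /* write the bytes to a local file; a definite-length block response starts
           with a '#nNNN...' header that you may want to strip before writing */
        FILE *fp = fopen("capture.gif", "wb");
        if (fp != NULL) {
            fwrite(buffer, 1, count, fp);
            fclose(fp);
        }
    }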

  • Code Generation and Optimization

    hi,
    we have been looking into a problem with our app, running in a JRockit JVM, pausing. Sometimes for up to a second or so at a time.
    We enabled the verbose logging of codegen and opt.
    And we see entries like:
    [Fri Sep  3 09:51:38 2004][28810][opt    ] #114 0x200 o com/abco/util/Checker.execute(Lcom/abco/util/;)V
    [Fri Sep  3 09:51:39 2004][28810][opt    ] #114 0x200 o @0x243c4000-0x243c7740 1186.23 ms (14784.88 ms)
    So the optimization took 1186 ms.
    Does this optimization happen in the main thread?
    I.e., is the above message an indication that the app had to stop for 1186ms while the optimization occurred?
    Any help on this would be greatly appreciated!
    Also, does anyone have any more pointers to info on the code generation and optimization in JRockit?
    I have only managed to find the following:
    http://edocs.beasys.com/wljrockit/docs142/intro/understa.html#1015273
    thanks,
    JN

    Hi,
    The optimization is done in its own thread and should not cause pauses in your application.
    The probable cause of long pause times is garbage collection. Enable verbose output for GC (-Xverbose:memory), or use the JRockit Management Console to monitor your application. Try the generational concurrent GC, -Xgc:gencon, or the GC strategy -Xgcprio:pausetime. Read more at: http://edocs.beasys.com/wljrockit/docs142/userguide/memman.html
    If you allocate a lot of small, short-lived objects, you might want to configure a nursery where small objects are allocated. When the nursery is full, only that small part of the heap is garbage collected, at a smaller cost than scanning the whole heap.
    If you need help tuning your application, you could make a JRA recording and send it to us: http://e-docs.bea.com/wljrockit/docs142/userguide/jra.html.
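    For example, a launch line using the flags mentioned above might look like this ("MyApp.jar" is just a placeholder):
    java -Xverbose:memory -Xgc:gencon -jar MyApp.jar
    or, to ask JRockit to optimize for short pause times instead:
    java -Xverbose:memory -Xgcprio:pausetime -jar MyApp.jar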
    Good luck,
    Cecilia
    BEA WebLogic JRockit

  • Relaunch and optimize taking forever - can I get some realtime help?

    I have a large, fast system running Vista 64 with 8 GB of RAM. Suddenly Outlook started running really slowly, so I launched Relaunch and Optimize. The optimization has been running for more than half an hour, using 95-100% of memory, yet only 3 to 5% of the CPU is ever used.
    Is this normal?
    What will happen to my catalog if I reset the system?
    As soon as I started the optimization I realized that I had not backed up the catalog before starting it.
    Jim Groan

    The optimization finished. It took almost an hour and used virtually all memory during that time. Since there was nothing else going on in the system, I still wonder whether this should be considered normal behavior.
    Jim

  • Analyze and fix not available in project timeline, fcpx

    When I imported media into my event, I did not select Analyze and Fix, but now I want to use this function. Analyze and Fix is available when I have a clip selected in the Event Library, but it is grayed out and not available when I select a clip in my project timeline. According to a lynda.com tutorial, this feature should be available in the timeline. Does anyone know why, or am I doing something wrong? Thanks.

    Hmm...looks like my shots didn't come through.  Let me try again using Insert Image.

  • Content analyzer and browser

    Hi,
    What are the Content Analyzer and Browser, and what does the transaction code RSBICA do?
    Regards,

    Martin
    I used the same document from help.sap.com to configure the settings. I am not sure what needs to be done. I ran transaction RSBICA, but no data is shown. I have loaded the DSOs, and no data could be extracted.
    Thanks
    Latha

  • Problem Modify and optimize an application

    We have a problem using Modify Application and Optimize Application on a specific application.
    The application name is GPFormat.
    When we run Modify Application and choose "Reassign SQL Index", we get the following error message:
    Error message: Cannot drop the index 'dbo.tblFACTGPFormat.IX_tblFACTGPFormat', because it does not exist or you do not have permission.
    And if we run Optimize Application with "Full Optimize" and "Compress database", we get the following error message:
    Error message: There is already an object named 'CONSTTBLFACTGPFORMAT' in the database.
    Could not create constraint. See previous errors.
    We are using BPC 5.1 SP5 and SQL 2005.

    It seems a previous run of Optimize with Compress failed.
    So you have to rename the CONSTTBLFACTGPFORMAT table and make sure that the SAP BPC installation user
    has the correct access to the table tblFactGPFormat.
    Are you using custom indexes for this table?
    I suggest dropping the existing clustered index on tblFactGPFormat and then running another Optimize with Compress; a sketch of these steps follows. This should fix all your problems.
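    A hedged T-SQL sketch of those steps against the BPC back-end SQL Server database; the object names are taken from the error messages above, so verify them (and take a backup) before running anything:

    -- see which indexes actually exist on the fact table first
    SELECT name, type_desc
    FROM sys.indexes
    WHERE object_id = OBJECT_ID('dbo.tblFACTGPFormat');
    -- move the leftover constraint table out of the way so Optimize can recreate it
    EXEC sp_rename 'dbo.CONSTTBLFACTGPFORMAT', 'CONSTTBLFACTGPFORMAT_OLD';
    -- drop the existing clustered index (use the actual name reported by the query above)
    DROP INDEX IX_tblFACTGPFormat ON dbo.tblFACTGPFormat;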
    Regards
    Sorin Radulescu

  • Analyze and fix hanging at 100%

    Hi all,
    I have a project with some 10 hours of footage, including about 5 clips that are about an hour long.
    I forgot to check "Analyze and Fix" on import and am trying to do so now.
    The problem is that whenever the first clip reaches 100%, it just hangs there and does not begin the next clip. Checking in the Inspector, the first clip does show itself as analyzed (video only, by the way). The only way I can get it to analyze the next clip is to force-quit FCPX.
    This is obviously very frustrating, as I want to select everything and leave it to analyze overnight.
    Any ideas?
    T

    Flakey install DVD?
    Patrick

  • [svn:osmf:] 12659: A few code cleanup and optimization tasks for the Manifest.

    Revision: 12659
    Author:   [email protected]
    Date:     2009-12-08 10:56:56 -0800 (Tue, 08 Dec 2009)
    Log Message:
    A few code cleanup and optimization tasks for the Manifest.
    Modified Paths:
        osmf/trunk/framework/MediaFramework/org/osmf/net/F4MLoader.as
        osmf/trunk/framework/MediaFramework/org/osmf/net/ManifestParser.as

    Many thanks for the fast reply.
    I have a follow-up question.
    What will happen if I modify the reconnect code in the OSMF NetLoader class as recommended and then load multiple third-party OSMF plugins, which may include the original OSMF version of the NetLoader class?
    Which one will be used at runtime?
    Thanks in advance!

  • User profiles for analyzer and query designer

    Hello,
    I have one short question about user profiles.
    Which user profile should I use for a user who only needs BEx Analyzer, and which profile does a user need to use BEx Analyzer plus BEx Query Designer?
    Thanks

    I just need the SAP roles that allow a user to use the Analyzer and the Query Designer.

  • How to Analyze and Fix?

    I'm trying to do an analysis of power line hum and then fix it.
    I open STP and a multitrack setup appears. I drag my file onto one of the tracks (though it's stereo, it only appears on one track, which is odd). When I click on it, it appears in the Film Editor window. I select the whole length of the file there, and in the Analysis tab I check Power Line Hum and Analyze. It analyzes, but everything at the bottom stays grayed out.
    How does one fix things using the Analysis tab?
    Thanks!

    Thanks for the tips, though I looked for Send and I don't see it.
    And if I do figure out how to do it (does double-clicking get you the same thing?), I believe I was in an audio project when I first tried to analyze and fix, and all the fix options were grayed out then, so how will going there now change anything?
    Sorry, I don't have the manual, but an audio file project is just the one where you have a big window on top and the Film Editor etc. tabs are blank, right?
    STP is a very difficult app that doesn't seem to follow a logic I can comprehend.
