Questions regarding risk register, resource pool, reports and KPIs

I am going to a requirements-gathering session today for a new MS Project Server project. I have knowledge of SharePoint, but my knowledge of Project Server is very limited. The topics below will be my points of discussion with the client. Can you please tell me what kinds of questions I can ask about them? I want to gather as many requirements as possible from the client.
Some of the questions I can think of are:
1. What fields will be in each register?
2. What permissions do you want on each register? For example, will each register be viewable by all users or not?
3. How many KPIs and dashboards do you need?
Topics
• Project Register
• Issue Register
• Risk Register
• Decision Register
• Invoice Register
• Change Request Register
• Deliverable Register
• Resource Pool
• Reports and KPIs

As you know, Project Server sits on SharePoint. Once you know which version of Project Server your client wants,
you can ask questions like the following (I am giving you a list of a few):
1. Approximate number of users, plus group- and security-related questions, such as which permissions they want for project managers, team members, resource managers, etc.
2. Approximate number of projects per year
3. Average number of tasks per project
4. Project-level custom fields
5. Task-level custom fields
6. Resource-level custom fields
7. Project-, resource-, or task-level custom views
8. Any demand management workflows
9. Backup and restore strategy
10. Any issues or risks associated with the project, and mitigation plans
11. Drivers or strategy for portfolio management
12. If any change needs to be made in the EPM environment, how will they handle it?
13. For delivery, you can split your project into phases: the first phase covering installation and basic configuration, then report development, then custom development, then training and support
14. Which kind of reports do they want (SSRS, Excel-based, PerformancePoint, dashboards, PowerPivot)?
15. For the reports, what would the KPIs be?
16. Tracking method, timesheet usage, status reports
17. Whether they will use the default project site or a customized project site
18. Enterprise project templates per business unit
19. Department-specific reports
20. For infrastructure, you can ask about redundancy, server performance, and high availability
etc.
Please group all these as per your needs.
kirtesh

Similar Messages

  • Question regarding Calculated Key Figures in BEx and their impact on SQL

    Hello,
    I am new to BO SAP integration. I have a question regarding using CKFs in BEx.
    I created a universe off of a BEx query with no CKF. I then created a Webi report with some dimensions and measures. I captured the SQL generated using a trace (ST05).
    In the same BEx query, I then created a CKF, refreshed the universe, and created a new Webi report using the same dimensions and the CKF. The SQL generated had many more select statements.
    My question is: what is the effect of a CKF on the generated SQL, and are there performance issues when using a CKF in BEx as opposed to creating variables in the Webi report?
    Thanks,
    Nikhil

    Hi,
    if your CKF will always have the same unit, and you have one KF in your InfoProvider with this unit, you can try this trick:
    create a new hidden CKF defined as new CKF = KF / KF (with this, the new CKF equals 1, in that unit)
    change your old CKF to old CKF = old CKF * new CKF
    Let me know if it works.
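    Spelled out, the arithmetic of the trick (the value is unchanged; only the unit handling is affected, since the hidden CKF is a unit-carrying 1):
    \[
    \mathrm{CKF}_{\mathrm{new}} = \frac{\mathrm{KF}}{\mathrm{KF}} = 1,
    \qquad
    \mathrm{CKF}_{\mathrm{old}}' = \mathrm{CKF}_{\mathrm{old}} \times \mathrm{CKF}_{\mathrm{new}} = \mathrm{CKF}_{\mathrm{old}}
    \]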

  • Basic question regarding Contract data exchange between CRM and ERP

    Hi,
    1) At a very general level, can somebody tell me via what technology are contract documents exchanged between CRM and ERP? IDocs, RFC modules or proxies?
    2) Is it possible to synchronise changes in a contract, e.g. if the contract is changed in the ERP system, are those changes automatically updated in the CRM contract?
    3) Can somebody point me to any documentation on this?
    Kind Regards,
    Tony.

    Contracts are exchanged between the CRM and ERP systems by RFC and function modules; you use the object SALESCONTRACT for this. It is possible to synchronize changes between both systems.

  • A questions regarding Raid 0 version 1.0 and upgrading to Leopard OS

    I am trying to upgrade my PowerPC Mac G5 from Tiger to Leopard. My G5 has two hard drives that are configured as a striped RAID 0 set, version 1. The Leopard installation stops every time after I choose to install the OS on the RAID volume, with the following message: "The installer cannot prepare the volume for installation. You may need to prepare the disk using Disk Utility". I have tried using Disk Utility to unmount the set, as well as unsuccessfully trying to convert the RAID set from version 1.0 to 2.0. The drives test good, and the Mac works fine.
    Does anyone know if Leopard will install on a Raid 0 set version 1 or is my only option to break the Raid set, re-partition each drive and then try to upgrade?
    Any help would be greatly appreciated. Thanks, Alan.

    I'm not sure which install option you've chosen, but here is something you might try:
    Repairing the Hard Drive and Permissions
    Boot from your Leopard Installer disc. After the installer loads, select your language and click on the Continue button. When the menu bar appears, select Disk Utility from the Utilities menu.
    After DU loads, select your hard drive entry (mfgr.'s ID and drive size) from the left side list. In the DU status area you will see an entry for the S.M.A.R.T. status of the hard drive. If it does not say "Verified" then the hard drive is failing or has failed. (SMART status is not reported on external FireWire or USB drives.)
    If the drive is "Verified", then select your OS X volume from the list on the left (the sub-entry below the drive entry), click on the First Aid tab, then click on the Repair Disk button. If DU reports any errors that have been fixed, re-run Repair Disk until no errors are reported. If no errors are reported, click on the Repair Permissions button. Wait until the operation completes, then quit DU and return to the installer.
    If DU reports errors it cannot fix, then you will need Disk Warrior and/or Tech Tool Pro to repair the drive. If you don't have either of them or if neither of them can fix the drive, then you will need to reformat the drive and reinstall OS X.
    Continue to the window where you select the target volume and click once on your RAID volume to select it, then click on the Options button. Select the Archive and Install option then click on the OK button. Proceed with the installation.
    If you want to preserve your Users folder and network preferences then be sure to check the box for that option with the Archive and Install option. This will automatically transfer all your user accounts and your network preferences.
    Upon completion you will have a Previous Systems Folder at the root directory of your hard drive. You may need to keep the old system in order to transfer any third-party installed items in the /Library/Application Support/ and /Library/Preferences/ folders. These items can be moved manually from those folders in the PSF to the corresponding locations in the newly installed system.

  • LR3 - Questions regarding virtual copies with different exposures and Merge to HDR Pro in Photoshop

    Hello,
    I'm currently testing LR3. I have a master copy and 4 virtual copies with different exposures selected in LR3. I thought to myself that I could simply use the "Edit In / Merge to HDR Pro in Photoshop" option and tune the resulting HDR image in Photoshop, but it seems that Photoshop doesn't take into consideration the exposure changes I've made on the virtual copies in LR3. The 5 images are identical in the HDR Pro tool.
    Is there something I can do about it? Maybe there is a step I've forgotten in my workflow or something?
    Maybe I could simply export the master and virtual copies to JPEGs and then import these into the Photoshop HDR Pro tool, but I don't think I would get the same results, right?
    Thank you for your help.

    Depending on the image ISO and overall content, you can get some pleasing DR pickup using this routine over simply hitting the fill light slider up and the exposure slider down.  I never really liked the PS approach since it seemed to require too much tweaking inside PS.  I have experimented with LR/Enfuse and Photomatix plug-ins and have settled on Photomatix for my HDR exposures (both "real" with multiple images and "fake" with a single image as you have described).  I think LR/Enfuse is donation ware so you might want to start there.  In both cases, the resultant image still needs a little adjusting back in LR after it is automatically imported into the Library.
    Jeff

  • Question regarding all the x58 Pro-e and memory problems

    I've noticed a serious number of issues when it comes to people not seeing all of their RAM or not getting triple-channel status.
    Before I even post my issue, I was wondering if there is a general fix for everyone experiencing these problems, because it seems I'm having the exact same issue as everyone else.

    Oh well. It's basically like I said - it's the same problem most other people are having. I have 6 GB of DDR3 RAM but only 4 GB ever shows up. And it shows in dual-channel mode, when everything should be triple-channel 3 x 2GB. First, here are the specs for my system, which was just recently purchased.
    Model of the motherboard: MSI X58 Pro-E LGA
    Model of the memory: G.SKILL 6GB (3 x 2GB) 240-Pin DDR3 SDRAM DDR3 1333 (PC3 10666) Triple Channel Kit
    Model of the CPU: Intel Core i7-930 Bloomfield 2.8GHz LGA 1366 130W
    Model of the GPU:  SAPPHIRE TOXIC 100282TXSR Radeon HD 5850 1GB 256-bit GDDR5
    Model of the HDD:  Western Digital Caviar Black WD1001FALS 1TB 7200 RPM SATA 3.0Gb/s
    Model of the PSU: CORSAIR CMPSU-750TX 750W ATX12V / EPS12V
    Type of Operating System: Windows 7 Home Premium 64-bit
    I have tried everything. I've updated the BIOS, I've tested each stick, I've even RMA'd the board itself, but to no avail. I have 3 sticks of RAM in the DIMM_A slots. Each stick is 2 GB, but the POST never counts more than 4 GB. I've tried adding the sticks one at a time in different slots and in different combinations, but nothing is working. I know the sticks work and I know the board is reading them, because they show up in Memory-Z in the BIOS and, in fact, a 3rd-party program called Speccy shows 6 GB! But it still says they're running in dual-channel mode.
    Here is the current BIOS I am running as shown in the POST output and model of the memory as shown on the sticker AND the model of the motherboard itself:
    BIOS: A7522IMS V8.10 030910
    RAM MODEL NUMBER: F3-10666CL9T-6GBNQ | DDR3-1333 CL 9-9-9-24 1.5v | PC3-10666 2GBX3 | 10060840076833
    MOBO MODEL NUMBER: MS-7522 | VER: 3.1
    Those are exactly what are printed on the label/sticker letter-for-letter.
    Quote from: HU16E on 04-April-10, 09:18:47
    Buy the correct RAM, don't bend any CPU socket pins, install the sticks in the correct slots, don't buy into the RAM makers claims of 'rated' performance.
    You're either implying that I incompetently bent a socket pin when I installed the CPU, placed the RAM in DIMM slots it doesn't belong in, or need to trash the $200 RAM I've just purchased.
    OK, assuming that ISN'T the case, what exactly are you suggesting that's supposed to be of help? You say "Buy the correct RAM" but you don't tell me what the 'correct' RAM is.

  • Active Directory Enterprise Resource Pool Synchronization

    I'm running into an issue with Project Server 2013 that has me pulling my hair out. A co-worker and I switched it over from SharePoint Permissions Mode to Project Permissions Mode and are trying to set up AD Enterprise Resource Pool synchronization.
    I go into Server Settings --> Operational Policies --> AD Resource Pool Synch and try to add 1 to 5 groups. The groups resolve (actually autocomplete) and appear correctly. If I click Save or Save & Synchronize Now, nothing happens. No page refresh, no redirect, etc.
    If I go into Security Settings --> Manage Groups, I can edit one of the built-in PWA groups and add the AD group, click Save, and get redirected back to Server Settings. But then, going back into the newly edited group, the AD Group field is empty, and on the Manage Groups page the "Active Directory Group" and "Last Sync" columns are empty.
    I would chalk this up to an AD issue, but I have no problem using the same account to add an individual user as an Enterprise Resource and tie it to an AD account, so I can only assume that this account has all of the permissions required for AD and can query and resolve fine. Does anyone have any idea what could be going on, or possibly some step I might have missed when setting this up?
    Thanks in advance.

    I know it looks like a different issue, but that is the only thing logged in ULS regarding Project Server at the time of trying to do the sync. If it helps, I can provide another log and see if something else comes up, but I did watch the ULS Viewer at the time, and the same errors kept popping up when trying to do the sync.
    Also, the "other environment" is just a second instance of PWA on the same SharePoint farm, so it's not a completely separate environment. It uses the same AD groups, so there shouldn't be any discrepancies. It's not the best test, I know, but I'm still in the middle of setting up an AD test environment to try it from there, just to see if it's an issue with my SharePoint installation or my PWA installation.
    Edit:
    Here's a full verbose log:
    09/11/2013 10:31:49.19    w3wp.exe (0x3BE0)    0x0DDC    Project Server    Security    agsdb    Verbose    PWA:http://portal2013/PWA, ServiceApp:Project Server Service Application, User:i:0#.w|*****\sharepointfarmsvc, PSI: Calling license check now; Thread.CurrentPrincipal.Identity.Name = 0#.w|*****\sharepointfarmsvc, Context.Username = i:0#.w|*****\sharepointfarmsvc, LogLevelManager Warning-ulsID:0x001920C1 has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f
    09/11/2013 10:31:49.19    w3wp.exe (0x3BE0)    0x0DDC    Project Server    Security    agsdc    Verbose    PWA:http://portal2013/PWA, ServiceApp:Project Server Service Application, User:i:0#.w|*****\sharepointfarmsvc, PSI: License check for BusinessObject method Admin.GetActiveDirectorySyncEnterpriseResourcePoolSettings2 passed - user i:0#.w|*****\sharepointfarmsvc (resuid=806f917d-85b6-e211-93ef-00155d0a5600) was licensed, LogLevelManager Warning-ulsID:0x001920C2 has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f
    09/11/2013 10:31:49.19    w3wp.exe (0x3BE0)    0x0DDC    Project Server    Security    agwmc    Verbose    PWA:http://portal2013/PWA, ServiceApp:Project Server Service Application, User:i:0#.w|*****\sharepointfarmsvc, PSI: Access Granted for user i:0#.w|*****\sharepointfarmsvc on BusinessObject method Admin.GetActiveDirectorySyncEnterpriseResourcePoolSettings2.  CheckFarmAdmin=False, CheckGlobalPerms=True, LogLevelManager Warning-ulsID:0x00196302 has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f
    09/11/2013 10:31:49.19    w3wp.exe (0x3BE0)    0x0DDC    Project Server    General    ad2n8    Verbose    PWA:http://portal2013/PWA, ServiceApp:Project Server Service Application, User:i:0#.w|*****\sharepointfarmsvc, PSI: Running command using non-transactioned Sql session., LogLevelManager Warning-ulsID:0x000DC362 has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f
    09/11/2013 10:31:49.19    w3wp.exe (0x3BE0)    0x0DDC    Project Server    Administration    agmy5    Verbose    PWA:http://portal2013/PWA, ServiceApp:Project Server Service Application, User:i:0#.w|*****\sharepointfarmsvc, PSI: GetActiveDirectorySyncEnterpriseResourcePoolSettings2: Settings retrieved, Status=NeverRun, LastUpdateTime=01/01/0001 00:00:00, ScheduledUpdates=False, ADGroupGuidCount=0, LogLevelManager Warning-ulsID:0x0018C61F has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f
    09/11/2013 10:31:49.19    w3wp.exe (0x3BE0)    0x0DDC    Project Server    Security    agsdb    Verbose    PWA:http://portal2013/PWA, ServiceApp:Project Server Service Application, User:i:0#.w|*****\sharepointfarmsvc, PSI: Calling license check now; Thread.CurrentPrincipal.Identity.Name = 0#.w|*****\sharepointfarmsvc, Context.Username = i:0#.w|*****\sharepointfarmsvc, LogLevelManager Warning-ulsID:0x001920C1 has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f
    09/11/2013 10:31:49.19    w3wp.exe (0x3BE0)    0x0DDC    Project Server    Security    agsdc    Verbose    PWA:http://portal2013/PWA, ServiceApp:Project Server Service Application, User:i:0#.w|*****\sharepointfarmsvc, PSI: License check for BusinessObject method Admin.ResolveActiveDirectoryGroups passed - user i:0#.w|*****\sharepointfarmsvc (resuid=806f917d-85b6-e211-93ef-00155d0a5600) was licensed, LogLevelManager Warning-ulsID:0x001920C2 has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f
    09/11/2013 10:31:49.19    w3wp.exe (0x3BE0)    0x0DDC    Project Server    Security    agwmc    Verbose    PWA:http://portal2013/PWA, ServiceApp:Project Server Service Application, User:i:0#.w|*****\sharepointfarmsvc, PSI: Access Granted for user i:0#.w|*****\sharepointfarmsvc on BusinessObject method Admin.ResolveActiveDirectoryGroups.  CheckFarmAdmin=False, CheckGlobalPerms=True, LogLevelManager Warning-ulsID:0x00196302 has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f
    09/11/2013 10:31:49.19    w3wp.exe (0x3BE0)    0x0DDC    Project Server    Active Directory Sync    af8v6    Verbose    PWA:http://portal2013/PWA, ServiceApp:Project Server Service Application, User:i:0#.w|*****\sharepointfarmsvc, PSI: ResolveActiveDirectoryGroups; resolving encodedClaims=c:0+.w|s-1-5-21-2100168755-2748282862-3037512304-1280, LogLevelManager Warning-ulsID:0x00162560 has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f
    09/11/2013 10:31:49.19    w3wp.exe (0x3BE0)    0x0DDC    Project Server    General    ad2n8    Verbose    PWA:http://portal2013/PWA, ServiceApp:Project Server Service Application, User:i:0#.w|*****\sharepointfarmsvc, PSI: Running command using non-transactioned Sql session., LogLevelManager Warning-ulsID:0x000DC362 has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f
    09/11/2013 10:31:49.19    w3wp.exe (0x3BE0)    0x0DDC    Project Server    Administration    agmy5    Verbose    PWA:http://portal2013/PWA, ServiceApp:Project Server Service Application, User:i:0#.w|*****\sharepointfarmsvc, PSI: GetActiveDirectorySyncEnterpriseResourcePoolSettings2: Settings retrieved, Status=NeverRun, LastUpdateTime=01/01/0001 00:00:00, ScheduledUpdates=False, ADGroupGuidCount=0, LogLevelManager Warning-ulsID:0x0018C61F has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f
    09/11/2013 10:31:49.19    w3wp.exe (0x3BE0)    0x0DDC    Project Server    Active Directory Sync    af8vj    Verbose    PWA:http://portal2013/PWA, ServiceApp:Project Server Service Application, User:i:0#.w|*****\sharepointfarmsvc, PSI: Entered ActiveDirectoryUtility::TryGetGroupInfoByClaim, LogLevelManager Warning-ulsID:0x00162549 has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f
    09/11/2013 10:31:49.19    w3wp.exe (0x3BE0)    0x0DDC    Project Server    Active Directory Sync    alv2g    Verbose    PWA:http://portal2013/PWA, ServiceApp:Project Server Service Application, User:i:0#.w|*****\sharepointfarmsvc, PSI: Querying for fqdn/netbios mapping of server *****, LogLevelManager Warning-ulsID:0x002D5706 has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f
    09/11/2013 10:31:49.20    w3wp.exe (0x3BE0)    0x0DDC    Project Server    Active Directory Sync    af8vl    Verbose    PWA:http://portal2013/PWA, ServiceApp:Project Server Service Application, User:i:0#.w|*****\sharepointfarmsvc, PSI: Entered ActiveDirectoryUtility::TryGetGroupInfo, LogLevelManager Warning-ulsID:0x0016254B has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f
    09/11/2013 10:31:49.20    w3wp.exe (0x3BE0)    0x0DDC    Project Server    General    aeby8    Medium    Error is: GeneralUnhandledException. Details: General Unhandled Exception in _Admin.ResolveActiveDirectoryGroups_ Attributes: System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary.
        at Microsoft.Office.Project.Server.ClaimsHelper.GetFormsAuthenticationProviderName(Uri context, SPUrlZone zone)
        at Microsoft.Office.Project.Server.BusinessLayer.ActiveDirectoryUtility.TryGetGroupInfo(String domain, String searchPath, String searchFilter)
        at Microsoft.Office.Project.Server.BusinessLayer.ActiveDirectoryUtility.TryGetGroupInfoByClaim(String encodedClaim)
        at System.Linq.Enumerable.WhereSelectArrayIterator`2.MoveNext()
        at System.Linq.Buffer`1..ctor(IEnumerable`1 source)
        at System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)
        at Microsoft.Office.Project.Server.Wcf.Implementation.PWAImpl.<>c__DisplayClass10a.<AdminResolveActiveDirectoryGroups>b__109()
        at Microsoft.Office.Project.Server.Wcf.Implementation.WcfMethodInvocation.InvokeBusinessObjectMethod(String businessObjectName, String methodName, IEnumerable`1 actions)
    Standard Information: PSI Entry Point:  Project User: i:0#.w|*****\sharepointfarmsvc Correlation Id: da702be3-ee1a-e311-93f8-00155d0a5600 PWA Site URL: http://portal2013/PWA SA Name: Project Server Service Application PSError: GeneralUnhandledException (42), LogLevelManager Warning-ulsID:0x00101622 has no entities explicitly specified.    c4f9419c-7789-a0b9-3d4a-941ceb42b25f

  • Difference in Amount between BI Report and GL Account

    Hi,
    Regarding the difference between BI report and GL,
    We found duplicated records in the data source 0FI_GL_4.
    (The difference was in another account which did not match the GL.)
    1.Account: 123456
      G/L Balance: $430,382.41
      BI Report:     $432,211.10
      Difference:     +$1,828.69
    ⇒ In the data source, there is a duplicated record for each of the two accounting documents below.
    Acct doc. 1900000900 item#2    $2,623.64
    Acct doc. 1900000901 item#2      -$794.95
    2.Account: 7891011
      G/L Balance: $92,301.92
      BI Report:     $78,341.42
      Difference:  -$13,960.50
    ⇒ In the data source, there is a duplicated record for the accounting document below.
    Acct doc. 1900000902 item#2   $13,960.50
    What could be the cause of this? How can I solve this issue?
    Thanks

    Hi,
    Since I already know where and what the duplicate amounts are in the PSA, can I do the following:
    1. Delete the amounts (change them to 0) in the PSA
    2. Delete that particular load from the cube
    3. Now that the duplicate amounts don't exist in the PSA, reload it back to the cube using a DTP
    Would this work? Is this even possible?
    Thanks

  • Question regarding DocumentDB RU consumption when inserting documents & write performance

    Hi guys,
    I do have some questions regarding the DocumentDB Public Preview capacity and performance quotas:
    My use case is the following:
    I need to store about 200,000,000 documents per day with a maximum of about 5000 inserts per second. Each document has a size of about 200 bytes.
    According to the documentation (http://azure.microsoft.com/en-us/documentation/articles/documentdb-manage/) I understand that I should be able to store about 500 documents per second with single inserts and about 1000 per second with batch inserts using a stored procedure. This would result in the need of at least 5 CUs just to handle the inserts.
    Since one CU consists of 2000 RUs, I would expect the RU usage to be about 4 RUs per single document insert, or 100 RUs for a single SP execution with 50 documents.
    When I look at the actual RU consumption, I get values I don't really understand:
    Batch insert of 50 documents: about 770 RUs
    Single insert: about 17 RUs
    Example document:
    {"id":"5ac00fa102634297ac7ae897207980ce","Type":0,"h":"13F40E809EF7E64A8B7A164E67657C1940464723","aid":4655,"pid":203506,"sf":202641580,"sfx":5662192,"t":"2014-10-22T02:10:34+02:00","qg":3}
    The consistency level is set to “Session”.
    I am using the SP from the example C# project for batch inserts, and the following code snippet for single inserts:
    await client.CreateDocumentAsync(documentCollection.DocumentsLink, record);
    Is there any flaw in my assumption (OK... obviously) regarding the throughput calculation, or could you give me some advice on how to achieve the throughput stated in the documentation?
    With the current performance I would need to buy at least 40 CUs, which wouldn't be an option at all.
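    For reference, the per-call charge can be read straight off the SDK response. This is only a minimal sketch, assuming the preview-era DocumentDB .NET SDK; "client", "collection" and "record" are illustrative placeholders:
    using System;
    using System.Threading.Tasks;
    using Microsoft.Azure.Documents;
    using Microsoft.Azure.Documents.Client;

    static class RuProbe
    {
        // Inserts one document and logs the request units the service
        // reports for that single call.
        public static async Task MeasureInsertChargeAsync(
            DocumentClient client, DocumentCollection collection, object record)
        {
            ResourceResponse<Document> response =
                await client.CreateDocumentAsync(collection.DocumentsLink, record);

            // RequestCharge is the RU cost of this one operation.
            Console.WriteLine("Insert consumed {0} RUs", response.RequestCharge);
        }
    }
    Logging this per insert (and per SP execution) makes it easy to compare the measured charge against the expected 4 RUs / 100 RUs from the calculation above.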
    I have another question regarding document retention:
    Since I would need to store a lot of data per day, I also would need to delete as much data per day as I insert:
    The data is valid for at least 7 days (it actually should be 30 days, depending on my options with DocumentDB).
    I guess there is nothing like a retention policy for documents (this document is valid for X days and will automatically be deleted after that period)?
    Since I guess deleting data on a single-document basis is no option at all, I would like to create a document collection per day and delete the collection after a specified retention period.
    Those historic collections would never change but would only receive queries. The only problem I see with creating collections per day is the missing throughput:
    As I understand it, the throughput is split equally according to the number of available collections, which would result in "missing" throughput on the actual hot collection (hot meaning the only collection I would actually insert documents into).
    Is there any (better) way to handle this use case than buying enough CUs so that the actual hot collection gets the needed throughput?
    Example: 
    1 CU -> 2000 RUs
    7 collections -> 2000 / 7 = 286 RUs per collection (per CU)
    Needed throughput for hot collection (values from documentation): 20,000
    => 70 CUs (20,000 / 286)
    vs. 10 CUs when using one collection and batch inserts or 20 CUs when using one collection and single inserts.
    I know that DocumentDB is currently in preview and that it is not possible to handle this use case as is, because of the limit of 10 GB per collection at the moment. I am just trying to do a POC so I can switch to DocumentDB when it is publicly available.
    Could you give me any advice on whether this kind of use case can, or should, be handled with DocumentDB? I currently use Table Storage for this case (currently with a maximum of about 2500 inserts per second) but would like to switch to DocumentDB, since I had to optimize for writes per second with Table Storage and have horrible query execution times with it because of full table scans.
    Once again my desired setup:
    200,000,000 inserts per day / maximum of 5000 writes per second
    Collection 1.2 -> Hot Collection: All writes (max 5000 p/s) will go to this collection. Will also be queried.
    Collection 2.2 -> Historic data, will only be queried; no inserts
    Collection 3.2 -> Historic data, will only be queried; no inserts
    Collection 4.2 -> Historic data, will only be queried; no inserts
    Collection 5.2 -> Historic data, will only be queried; no inserts
    Collection 6.2 -> Historic data, will only be queried; no inserts
    Collection 7.2 -> Historic data, will only be queried; no inserts
    Collection 1.1 -> Old, so delete whole collection
    As a matter of fact, the perfect setup would be to have only one (huge) collection with automatic document retention... but I guess this won't be an option at all?
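    For completeness, the rolling collection-per-day scheme above would look roughly like this; again a sketch only, with illustrative names, assuming the preview-era .NET SDK:
    using System;
    using System.Threading.Tasks;
    using Microsoft.Azure.Documents;
    using Microsoft.Azure.Documents.Client;

    static class CollectionRotation
    {
        // Creates today's "hot" collection, e.g. "events-20141022".
        public static Task<ResourceResponse<DocumentCollection>> CreateHotCollectionAsync(
            DocumentClient client, Database database)
        {
            string id = "events-" + DateTime.UtcNow.ToString("yyyyMMdd");
            return client.CreateDocumentCollectionAsync(
                database.SelfLink, new DocumentCollection { Id = id });
        }

        // After the retention window, drops the oldest collection wholesale
        // instead of deleting millions of documents one by one.
        public static Task DeleteExpiredCollectionAsync(
            DocumentClient client, DocumentCollection expired)
        {
            return client.DeleteDocumentCollectionAsync(expired.SelfLink);
        }
    }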
    I hope you understand my problem, and can give me some advice on whether this is at all possible now, or will be possible in the future, with DocumentDB.
    Best regards and thanks for your help

    Hi Aravind,
    first of all, thanks for your reply regarding my questions.
    I sent you a mail a few days ago, but since I did not receive a response I am not sure it got through.
    My main question regarding the actual usage of RUs when inserting documents is still my main concern, since I cannot insert nearly as many documents per second and CU as expected.
    According to the documentation (http://azure.microsoft.com/en-us/documentation/articles/documentdb-manage/) I understand that I should be able to store about 500 documents per second with single inserts, and about 1000 per second with a batch insert using a stored procedure (20 batches per second containing 50 documents each).
    As described in my post, the actual usage is multiple (actually 6-7) times higher than expected... even when running the C# examples provided at:
    https://code.msdn.microsoft.com/windowsazure/Azure-DocumentDB-NET-Code-6b3da8af/view/SourceCode
    I tried all the ideas Steve posted (manual indexing & lazy indexing mode) but was not able to reduce RU consumption to a point where 500 inserts per second were nearly possible.
    Here again are my findings regarding RU consumption for batch inserts:
    Automatic indexing on: 777 RUs for 50 documents
    Automatic indexing off & mandatory path only: 655 RUs for 50 documents
    Automatic indexing off & IndexingMode Lazy & mandatory path only: 645 RUs for 50 documents
    Expected result: approximately 100 RUs (2000 RUs => 20x batch insert of 50 => 100 RUs per batch)
    Since DocumentDB is still in Preview, I understand that it is not yet capable of handling my use case regarding throughput, collection size, number of collections and possible CUs, and I am fine with that.
    If I am able to (at least nearly) reach the stated performance of 500 inserts per second per CU, I am totally fine for now. If not, I have to move on and look for other options... which would also be "fine". ;-)
    Is there actually any working example code that manages to do 500 single inserts per second with one CU's 2000 RUs, or is this a totally theoretical value? Or is it just because of being Preview, and the stated values are planned to work later?
    Regarding your feedback:
    ...another thing to consider is if you can amortize the request rate over the average of 200M requests/day = 2000 requests/second, then you'll need to provision 16 capacity units instead of 40 capacity units. You can do this by catching "RequestRateTooLargeExceptions" and retrying after the server specified retry interval...
    Sadly this is not possible for me, because I have to query the data in near real time for my use case... so queuing is not an option.
    We don't support a way to distribute throughput differently across hot and cold collections. We are evaluating a few solutions to enable this scenario, so please do propose it as a feature at http://feedback.azure.com/forums/263030-documentdb as this helps us prioritize feature work. Currently, the best way to achieve this is to create multiple collections for hot data, and shard across them, so that you get more proportionate throughput allocated to it.
    I guess I could circumvent this by not clustering into "hot" and "cold" collections but "hot" and "cold" databases, with one or multiple collections each (if 10 GB remains the limit per collection), if there were a way to (automatically?) scale the CUs via an API. Otherwise I would have to manually scale down the DBs holding historic data. I also added a feature request as proposed by you.
    Sorry for the long post, but I am planning the future architecture for one of our core systems and want to be sure I am on the right track.
    So if you are able to answer just one question, it would be this one:
    How do I achieve the stated throughput of 500 single inserts per second with one CU's 2000 RUs in reality? ;-)
    Best regards and thanks again

  • Summary Report question regarding data that is included and data that is not.

    I have a survey running at the moment with about 15% of the expected responses (250-300), which I'll gather over the next couple of weeks. When I check the current Summary Report, it omits several of the questions and answers shown in the current Analyze table. Is this to be expected? I really like the Summary Report and want my co-chairs to be able to drop in when they have time and glance at the current results via the Summary Report tab.
    Thank you
    Carol

    Sorry that you're having trouble Carol. In the Summary Report tab, are you able to add report tiles for fields without tiles by using the Insert menu?

  • Basic question regarding SSIS 2010 Package where source is Microsoft Excel 97-2005 and there is no Microsoft office or Excel driver installed in Production

    Hi all,
    I have one basic question regarding an SSIS 2010 package where the source is Microsoft Excel 97-2005. I wanted to know how this package works in production where there is no Microsoft Office or Excel driver installed. To check whether an Excel driver is installed, I followed these steps: Start --> Administrative Tools --> Data Sources (ODBC) --> Drivers, and I found only 2 drivers: one is SQL Server and the other is SQL Server Native Client 11.0.
    The Windows edition is Windows Server 2008 R2 Enterprise, Service Pack 1, and the system type is a 64-bit operating system.
    We are running this package from SQL Server Agent using the 32-bit DTExec:
    \\Machine_Name\d$\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\DTExec.exe /FILE "\\Machine_Name\d$\ Folder_Name\EtL.dtsx" /CONFIGFILE "\\Machine_Name\d$\Folder_Name\Config.dtsConfig" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING E
    I opened the package and tried to find out what connection we have used, and found that we have used an "Excel Connection Manager" with ConnectionString=Provider=Microsoft.Jet.OLEDB.4.0;Data Source=F:\Fares.xls;Extended Properties="EXCEL 8.0;HDR=YES"; and the source is an 'Excel Source'.
    I discussed this with my DBA and he said that SSIS has an inbuilt Excel driver, but I am not convinced.
    Could anyone please clear up my confusion/doubt?
    I have gone through various links but my doubt is still not cleared.
    Quick Reference:
    SSIS in 32- and 64-bits
    http://toddmcdermid.blogspot.com.au/2009/10/quick-reference-ssis-in-32-and-64-bits.html
    Why do I get "product level is insufficient..." error when I run my SSIS package?
    http://blogs.msdn.com/b/michen/archive/2006/11/11/ssis-product-level-is-insufficient.aspx
    How to run SSIS Packages using 32-bit drivers on 64-bit machine
    http://help.pragmaticworks.com/dtsxchange/scr/FAQ%20-%20How%20to%20run%20SSIS%20Packages%20using%2032bit%20drivers%20on%2064bit%20machine.htm
    Troubleshooting OLE DB Provider Microsoft.ACE.OLEDB.12.0 is not registered Error when importing data from an Excel 2007 file to SQL Server 2008
    http://www.mytechmantra.com/LearnSQLServer/Troubleshoot_OLE_DB_Provider_Error_P1.html
    How Can I Get a List of the ODBC Drivers that are Installed on a Computer?
    http://blogs.technet.com/b/heyscriptingguy/archive/2005/07/07/how-can-i-get-a-list-of-the-odbc-drivers-that-are-installed-on-a-computer.aspx
    Thanks Shiven:) If Answer is Helpful, Please Vote

    Hi S Kumar Dubey,
    In SSIS, the Excel Source and Excel Destination natively use the Microsoft Jet 4.0 OLE DB Provider, which is installed by SQL Server. The Microsoft Jet 4.0 OLE DB Provider deals with .xls files created by Excel 97-2003. To deal with .xlsx files created by Excel 2007, we need the Microsoft ACE OLEDB Provider. SQL Server doesn't install the Microsoft ACE OLEDB Provider; to get it we can install the 2007 Office System Driver: Data Connectivity Components, the Microsoft Access Database Engine 2010 Redistributable, or the Microsoft Office suite.
    The drivers listed in the ODBC Data Source Administrator are ODBC drivers, not OLEDB drivers; therefore, the Excel Source/Destination in SSIS won't use the ODBC driver for Excel listed in it by default. On a 64-bit Windows platform, there are two versions of the ODBC Data Source Administrator. The 64-bit ODBC Data Source Administrator is C:\Windows\System32\odbcad32.exe, while the 32-bit one is C:\Windows\SysWOW64\odbcad32.exe. The original 32-bit and 64-bit ODBC drivers are installed by the Windows operating system. By default, there are multiple 32-bit ODBC drivers and fewer 64-bit ODBC drivers installed on a 64-bit platform. To get more ODBC drivers, we can install the 2007 Office System Driver: Data Connectivity Components or the Microsoft Access Database Engine 2010 Redistributable.
    Besides, please note that the 2007 Office System Driver: Data Connectivity Components only installs 32-bit ODBC and OLEDB drivers because it only has a 32-bit version, but the Microsoft Access Database Engine 2010 Redistributable has both 32-bit and 64-bit versions.
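    If you want to verify which OLE DB providers are actually registered on the production server, you can enumerate them from a small console application. This is only a sketch; note that the returned list depends on whether the process runs as 32-bit or 64-bit, the same bitness rule that applies to the Excel Source:
    using System;
    using System.Data;
    using System.Data.OleDb;

    class ListOleDbProviders
    {
        static void Main()
        {
            // Enumerates the OLE DB providers visible to this process.
            // A 32-bit build sees 32-bit providers (e.g. Microsoft.Jet.OLEDB.4.0);
            // a 64-bit build sees only 64-bit providers.
            DataTable providers = new OleDbEnumerator().GetElements();
            foreach (DataRow row in providers.Rows)
            {
                Console.WriteLine("{0} - {1}",
                    row["SOURCES_NAME"], row["SOURCES_DESCRIPTION"]);
            }
        }
    }
    Running it compiled as x86 versus x64 shows directly whether the Jet 4.0 provider your package relies on is available to the 32-bit DTExec you are using.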
    If you have any questions, please feel free to ask.
    Regards,
    Mike Yin
    TechNet Community Support

  • Unable to register Hyperion Reporting and Analysis with Shared Services

    Hello everybody,
    I'm trying to install and configure Foundation Services in order to schedule Brio Query documents execution and distribution.
    IBM DB2 Express has been installed and configured.
    Shared Services have been installed and configured.
    Reporting and Analysis - System 9 Services have been installed and not yet configured.
    Reporting and Analysis - System 9 UI Services have been installed and not yet configured.
    When proceeding with the Hyperion Reporting and Analysis configuration step of registering with Shared Services, supplying the same host, port, user profile and password used for configuring Shared Services, I get the following message:
    "An unknown CSS error occured.
    Please check that the information entered is correct."
    Messages from configtool.log file:
    "(Aug 21, 2008, 09:32:32 AM), com.hyperion.cis.config.CmsRegistrationUtil, DEBUG, Getting CSS instance
    (Aug 21, 2008, 09:32:32 AM), com.hyperion.cis.config.CmsRegistrationUtil, DEBUG, URL for CSS.xml = https://itblq001.wincor-nixdorf.com:58080/interop/framework/getCSSConfigFile
    (Aug 21, 2008, 09:32:32 AM), com.hyperion.cis.config.CmsRegistrationUtil, DEBUG, Authenticate LStoppa user to CSS.
    (Aug 21, 2008, 09:32:32 AM), com.hyperion.cis.config.CmsRegistrationUtil, ERROR, Failed to authenticate user = LStoppa"
    I can't understand what's happening. Can you help me?
    Thank you for any hints.
    Regards
    Lucia

    The service runs until I try to confirm the registration while configuring HRA. Then it stops.
    Apologies for the simple question, as I am trying to get the whole Hyperion System 9 running with little knowledge of it: how do I log in to Shared Services?
    Thanks

  • How does failover and load distribution work in Operations Manager resource pools?

    Hello everyone,
    I have a couple of questions in regard to failover and load distribution in resource pools.
    What is the criteria for an x-plat or network device to initiate a failover?
    How is load distributed in a resource pool - Is it round-robin, performance based or something completely different?
    In case more than two management servers are in the same pool, and a management server becomes unavailable, how is load distributed among the remaining resource pool members?
    Any help is appreciated.
    Thx,
    Claus

    Hi,
    Resource Pools are a collection of Health Services working together to manage instances assigned to the pool. 
    Workflows targeted to the instances are loaded by the Health Service in the Resource Pool that ends up managing that instance. 
    If one of the Health Services in the Resource pool were to fail, the other Health Services in the pool will pick up the work that the failed member was running. 
    And I would like to share this article with you. Hope it helps.
    http://www.systemcentercentral.com/how-does-the-failover-process-work-in-opsmgr-2012-scom-sysctr/
    Niki Han
    TechNet Community Support

  • Difference between a Detail Report and a Register

    Hello all,
    Very basic question.
    What's the difference between a detail report, a summary report and a register report?
    Regards,
    Sam

    Hi,
    Please refer to the reports documentation as it should answer your question.
    Oracle Reports Documentation
    http://www.oracle.com/technetwork/middleware/reports/documentation/reports-087551.html
    Thanks,
    Hussein

  • Question regarding Dashboard and column prompt

    My questions regarding dashboards and column prompts:
    1) Dashboard prompts usually work only for columns which are in the subject area. In my report I've created some columns which are based on other columns; for example, I have a daysNumber column that is based on two other columns, as it calculates the difference of two dates. When I create a dashboard prompt I can't find this column there, and I need to make a prompt on this column.
    2) One of the columns has only two values, 1 and 0. When I create a prompt for this column, is it possible that the drop-down list shows 'Yes' for 1 and 'No' for 0 and still filters the request?

    Hi Toony,
    I think there is another way of doing this.
    In the dashboard prompt, go to the Show option and select SQL Results from the dropdown.
    There you need to write your logical SQL, like:
    SELECT CASE WHEN 1=0 THEN PERIODS.YEAR ELSE <difference-of-dates formula> END FROM SubjectAreaName
    Here, PERIODS.YEAR is a column which already exists in the repository's presentation layer, and the <difference-of-dates formula> is the code or formula of the column which you want to show in the drop-down.
    Also write the same CASE WHEN 1=0 THEN PERIODS.YEAR ELSE <difference-of-dates formula> END code in the fx of that prompt.
    I think this helps you in doing this.
    Just check and inform me if it works...
    Thanks & Regards
    Kishore Guggilla
    Edited by: Kishore Guggilla on Oct 31, 2008 9:35 AM
