JetForms processing slow on new Citrix server

JetForms processing on a new Citrix server running Windows 2008 R2 takes twice as long as on the legacy Windows 2003 server.
Example invoice processing/printing: Legacy = 8-10 seconds; New = 16-20 seconds.
Printing the same invoice directly from Adobe Reader takes 5.5 seconds on both servers, so it doesn't appear to be a network issue.
Any ideas on the cause of the latency? We are currently testing with NO antivirus software installed/enabled on the new server.

Hi Paul -- thanks again for taking time to work with me on this. I did the DOS test, and even copying a 6 MB file was practically instantaneous on the new server.
I had previously reported on the 2 lines in the log file that are adding the delay. The thing is, there are actually 8 lines in the log file accessing the same network location, and only 2 of the eight are delayed when comparing the new server to the old. Here are all of the lines in the order they appear in the log file. (There are 129 actual entries in the log, but these are the only 8 that access the C: drive location):
03/15/13 10:50:21 C:\Program Files\Adobe\Central\Bin\jfserver.exe: [307]Launching task '"C:\Program Files\Adobe\Central\Bin\jfptool" "C:\jfsrvr\Data\IN1006518.dat" "IN1006518_ds.dat" "C:\jfsrvr\jfmerge.ini" -stripport'.
03/15/13 10:50:21 C:\Program Files\Adobe\Central\Bin\jfserver.exe: [307]Launching task '"C:\Program Files\Adobe\Central\Bin\jfmerge" "*" "C:\jfsrvr\Data\IN1006518_ds.dat" -afxon -apr"" -all"C:\jfsrvr\jfserver.log" -zNUL: -asl1 -amq0 -jmst  -f260,261 -aii"C:\jfsrvr\jfmerge.ini"'.
03/15/13 10:50:21 C:\Program Files\Adobe\Central\Bin\jfmerge: [125]* Processing data file: 'C:\jfsrvr\Data\IN1006518_ds.dat'.
03/15/13 10:50:21 C:\Program Files\Adobe\Central\Bin\jfmerge: [289]MDF file `C:\jfsrvr\forms\INVOICE_USER.mdf' opened.
03/15/13 10:50:22 C:\Program Files\Adobe\Central\Bin\jfserver.exe: [307]Launching task '"C:\Program Files\Adobe\Central\Bin\jfmerge" "*" "C:\jfsrvr\Data\IN1006518_ds.dat" -aspwindows -afxon -apr"" -all"C:\jfsrvr\jfserver.log" -z"\\HF-DCFS.hfe.local\2C-Sacto/EMF=C:\jfsrvr\Data\JfServer.TFC" -asl1 -amq0 -jmst  -aii"C:\jfsrvr\jfmerge.ini" -advglobal:pagecount=1'.
03/15/13 10:50:22 C:\Program Files\Adobe\Central\Bin\jfmerge: [125]* Processing data file: 'C:\jfsrvr\Data\IN1006518_ds.dat'.
... and then the last two that each take 4 seconds longer than on the legacy server:
03/15/13 10:50:22 C:\Program Files\Adobe\Central\Bin\jfmerge: [289]MDF file `C:\jfsrvr\forms\INVOICE_USER.mdf' opened.
03/15/13 10:50:23 C:\Program Files\Adobe\Central\Bin\jfserver.exe: [307]Launching task '"C:\Program Files\Adobe\Central\Bin\uncollate" "C:\jfsrvr\Data\JfServer.TFC" -Z"\\HF-DCFS.hfe.local\2C-Sacto" "-f260,261" -d'.
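For anyone comparing the two servers, one rough way to pinpoint which entries stall is to compute the gap between consecutive timestamps in jfserver.log on each server and diff the output. This is only a sketch: it assumes every entry starts with a MM/dd/yy HH:mm:ss timestamp as in the excerpt above, and the log path is taken from that excerpt.
# Sketch: report the gap (in whole seconds) between consecutive jfserver.log entries.
# Run on both servers and compare which entries account for the extra time.
$logPath = 'C:\jfsrvr\jfserver.log'
$prev = $null
Get-Content $logPath | ForEach-Object {
    if ($_ -match '^(\d{2}/\d{2}/\d{2} \d{2}:\d{2}:\d{2})') {
        $ts = [datetime]::ParseExact($matches[1], 'MM/dd/yy HH:mm:ss', $null)
        if ($prev) {
            $gap = ($ts - $prev).TotalSeconds
            if ($gap -ge 1) { '{0,4:N0}s  {1}' -f $gap, $_ }
        }
        $prev = $ts
    }
}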

Similar Messages

  • Crystal Report 9 slow down when we move old DB to new DB server

    We are using Crystal Reports 9; recently we moved to a new database server, but the reports' data sources still point to the old database server, which is no longer on the network.
    Now we have a performance issue when we call the reports from the web application: it takes a long time to retrieve even simple report data (e.g. 1 minute). If we manually update a report's data source to point to the new server, the report pulls data very quickly (e.g. 1 second for the same report). Is there any easy way to bulk update all the reports' data sources to point to the new server? We have 1000 reports, so we cannot use a manual process; it would take too long, and it would also be a problem in the future if we change servers again.
    Thanks for your help!
    Ram

    The 1-minute delay is the ODBC driver's default connection timeout while it tries to reach the old server. Crystal Reports has no tool to migrate/update database connection info. You can develop your own, though, or search the web for a third-party tool; I'm sure there are lots out there.
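    For what it's worth, the repointing can be scripted against the Crystal Reports .NET runtime rather than done by hand. The sketch below is only an outline: it assumes the CrystalDecisions.CrystalReports.Engine assembly is available (e.g. in the GAC), that ApplyLogOnInfo/SaveAs persist the change for your report versions, and the server, database, credentials and folder are placeholders. Test on copies first.
    # Sketch only: bulk-repoint .rpt files to a new database server.
    Add-Type -AssemblyName 'CrystalDecisions.CrystalReports.Engine'
    $newServer = 'NEWDBSERVER'; $database = 'SalesDB'; $user = 'report_user'; $pass = 'secret'   # placeholders
    Get-ChildItem 'D:\Reports' -Filter *.rpt -Recurse | ForEach-Object {
        $rpt = New-Object CrystalDecisions.CrystalReports.Engine.ReportDocument
        $rpt.Load($_.FullName)
        foreach ($table in $rpt.Database.Tables) {
            # Point each table's logon at the new server/database.
            $logon = $table.LogOnInfo
            $logon.ConnectionInfo.ServerName   = $newServer
            $logon.ConnectionInfo.DatabaseName = $database
            $logon.ConnectionInfo.UserID       = $user
            $logon.ConnectionInfo.Password     = $pass
            $table.ApplyLogOnInfo($logon)
        }
        $rpt.SaveAs($_.FullName)   # overwrite in place; keep backups
        $rpt.Close(); $rpt.Dispose()
    }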

  • Error while accessing SharePoint 2013 news feed REST api - "The server encountered an error processing the request. See server logs for more details."

    Hi Experts,
    I am facing an issue while accessing the SharePoint 2013 news feed REST API URL <SiteCollectionURL>/_api/social.feed/my/news from the browser; it returns the error "The server encountered an error processing the request. See server logs for more details."
    This happens after posting an image to the news feed without entering any text or description with it. If I post an image with some text or description, I can get the feeds; likewise, if I delete the image-only post, I can get the feeds.
    I can see the logs below in the log files.
    Exception occured in scope Microsoft.Office.Server.Social.SPSocialRestFeed._SerializeToOData. Exception=System.MissingMethodException: No parameterless constructor defined for this object.
       at System.RuntimeTypeHandle.CreateInstance(RuntimeType type, Boolean publicOnly, Boolean noCheck, Boolean& canBeCached, RuntimeMethodHandleInternal& ctor, Boolean& bNeedSecurityCheck)
       at System.RuntimeType.CreateInstanceSlow(Boolean publicOnly, Boolean skipCheckThis, Boolean fillCache, StackCrawlMark& stackMark)
       at System.RuntimeType.CreateInstanceDefaultCtor(Boolean publicOnly, Boolean skipCheckThis, Boolean fillCache, StackCrawlMark& stackMark)
       at System.Activator.CreateInstance(Type type, Boolean nonPublic)
       at System.Activator.CreateInstance(Type type)
       at Microsoft.SharePoint.Client.ValueTypeConverter.<GetODataProperties>d__2.MoveNext()
       at Microsoft.Data.OData.Atom.ODataAtomPropertyAndValueSerializer.WriteProperties(IEdmStructuredType owningType, IEnumerable`1 cachedProperties, Boolean isWritingCollection, Action beforePropertiesAction, Action afterPropertiesAction, DuplicatePropertyNamesChecker duplicatePropertyNamesChecker, EpmValueCache epmValueCache, EpmSourcePathSegment epmSourcePathSegment, ProjectedPropertiesAnnotation projectedProperties)
       at Microsoft.Data.OData.Atom.ODataAtomPropertyAndValueSerializer.WriteComplexValue(ODataComplexValue complexValue, IEdmTypeReference metadataTypeReference, Boolean isOpenPropertyType, Boolean isWritingCollection, Action beforeValueAction, Action afterValueAction, DuplicatePropertyNamesChecker duplicatePropertyNamesChecker, CollectionWithoutExpectedTypeValidator collectionValidator, EpmValueCache epmValueCache, EpmSourcePathSegment epmSourcePathSegment, ProjectedPropertiesAnnotation projectedProperties)
       at Microsoft.Data.OData.Atom.ODataAtomPropertyAndValueSerializer.WriteProperty(ODataProperty property, IEdmStructuredType owningType, Boolean isTopLevel, Boolean isWritingCollection, Action beforePropertyAction, EpmValueCache epmValueCache, EpmSourcePathSegment epmParentSourcePathSegment, DuplicatePropertyNamesChecker duplicatePropertyNamesChecker, ProjectedPropertiesAnnotation projectedProperties)
       at Microsoft.Data.OData.Atom.ODataAtomPropertyAndValueSerializer.WriteProperties(IEdmStructuredType owningType, IEnumerable`1 cachedProperties, Boolean isWritingCollection, Action beforePropertiesAction, Action afterPropertiesAction, DuplicatePropertyNamesChecker duplicatePropertyNamesChecker, EpmValueCache epmValueCache, EpmSourcePathSegment epmSourcePathSegment, ProjectedPropertiesAnnotation projectedProperties)
       at Microsoft.Data.OData.Atom.ODataAtomPropertyAndValueSerializer.WriteComplexValue(ODataComplexValue complexValue, IEdmTypeReference metadataTypeReference, Boolean isOpenPropertyType, Boolean isWritingCollection, Action beforeValueAction, Action afterValueAction, DuplicatePropertyNamesChecker duplicatePropertyNamesChecker, CollectionWithoutExpectedTypeValidator collectionValidator, EpmValueCache epmValueCache, EpmSourcePathSegment epmSourcePathSegment, ProjectedPropertiesAnnotation projectedProperties)
       at Microsoft.Data.OData.Atom.ODataAtomPropertyAndValueSerializer.WriteCollectionValue(ODataCollectionValue collectionValue, IEdmTypeReference propertyTypeReference, Boolean isOpenPropertyType, Boolean isWritingCollection)
       at Microsoft.Data.OData.Atom.ODataAtomPropertyAndValueSerializer.WriteProperty(ODataProperty property, IEdmStructuredType owningType, Boolean isTopLevel, Boolean isWritingCollection, Action beforePropertyAction, EpmValueCache epmValueCache, EpmSourcePathSegment epmParentSourcePathSegment, DuplicatePropertyNamesChecker duplicatePropertyNamesChecker, ProjectedPropertiesAnnotation projectedProperties)
       at Microsoft.Data.OData.Atom.ODataAtomPropertyAndValueSerializer.WriteProperties(IEdmStructuredType owningType, IEnumerable`1 cachedProperties, Boolean isWritingCollection, Action beforePropertiesAction, Action afterPropertiesAct...
    Can anyone please help me out.
    Thanks!
    dinesh

    O365,
    Is this still an issue?
    Thanks!
    Ed Price, Azure & Power BI Customer Program Manager
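    If it helps to reproduce the failure outside the browser, the same endpoint can be called from PowerShell. This is just a sketch assuming an on-premises farm with Windows authentication; the site URL below is a placeholder for <SiteCollectionURL>.
    # Sketch: call the social feed REST endpoint with the current Windows credentials.
    $uri = 'https://sharepoint.contoso.com/sites/team/_api/social.feed/my/news'   # placeholder URL
    try {
        $feed = Invoke-RestMethod -Uri $uri -UseDefaultCredentials -Headers @{ Accept = 'application/json;odata=verbose' }
        $feed   # inspect the returned feed; with the bad image-only post present, the 500 described above is expected instead
    } catch {
        Write-Warning "Request failed: $($_.Exception.Message)"
    }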

  • Server is unwilling to process request with New-ADuser, Csv-Import.

    function Read-OpenFileDialog([string]$WindowTitle, [string]$InitialDirectory, [string]$Filter = "CSV (*.csv)|*.csv", [switch]$AllowMultiSelect)
    {
        Add-Type -AssemblyName System.Windows.Forms
        $openFileDialog = New-Object System.Windows.Forms.OpenFileDialog
        $openFileDialog.Title = $WindowTitle
        if (![string]::IsNullOrWhiteSpace($InitialDirectory)) { $openFileDialog.InitialDirectory = $InitialDirectory }
        $openFileDialog.Filter = $Filter
        if ($AllowMultiSelect) { $openFileDialog.MultiSelect = $true }
        $openFileDialog.ShowHelp = $true # Without this line the ShowDialog() function may hang depending on system configuration and running from console vs. ISE.
        $openFileDialog.ShowDialog() > $null
        if ($AllowMultiSelect) { return $openFileDialog.Filenames } else { return $openFileDialog.Filename }
    }
    $filePath = Read-OpenFileDialog -WindowTitle "Select CSV File" -InitialDirectory "C:\Users\$env:USERNAME" -Filter "Excel Spreadsheet (*.csv)|*.csv"
    if (![string]::IsNullOrEmpty($filePath)) { Write-Host "You selected the file: $filePath" }
    else { "You did not select a file." }
    Write-Host "This script will grab the path designated in the path column, make sure it is correct."
    Write-Host 'Press Enter to Continue...'
    Read-Host " "
    Import-Module ActiveDirectory
    Import-Csv $filePath | ForEach-Object {
        # Create one AD user per CSV row.
        New-ADUser `
            -SamAccountName $_.sAMAccount `
            -Name $_.Name `
            -DisplayName $_.Name `
            -GivenName $_.cn `
            -Surname $_.sn `
            -Description $_.Description `
            -Department $_.Department `
            -EmailAddress $_.PrimaryEmail `
            -Path $_.Path `
            -AccountPassword (ConvertTo-SecureString -AsPlainText $_.Password -Force) `
            -Enabled $True `
            -PasswordNeverExpires $True `
            -HomeDirectory $_.HomeShare `
            -HomeDrive $_.HomeDrive `
            -ScriptPath $_.LogonScript `
            -PassThru
    }
    Here is my code. I know there are backticks, don't hate. I am getting a "server is unwilling to process the request" error from New-ADUser. Do I need another variable for New-ADUser? Everything else works.

    Is this the correct format for path?
    >school.local
    ->ou-User Accounts
    --->ou-Students
    ------>ou-2015
    OU=User Accounts,OU=Students,OU=2015,DC=school,DC=local
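    For reference, an Active Directory distinguished name lists the most specific container first, so for the hierarchy sketched above the -Path value in the CSV would normally be built deepest-OU-first. The snippet below is only an illustration of that ordering plus a quick existence check; the DN is hypothetical.
    # Hypothetical -Path for school.local > User Accounts > Students > 2015 (deepest OU first):
    $path = 'OU=2015,OU=Students,OU=User Accounts,DC=school,DC=local'
    Import-Module ActiveDirectory
    try   { Get-ADOrganizationalUnit -Identity $path | Out-Null; 'OU found.' }
    catch { Write-Warning "OU not found: $path" }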

  • New VM Server - Very Slow Performance with QuickBooks

    Hello all,
    Oh the dreaded performance issues!
    Okay we currently have a virtual server hosted with a different company - it has two VPU's and 8GB RAM.  The contract is about to expire so we're switching to a Microsoft Azure VM.  This server is running Windows Server 2008 R2 - main purpose is
    to host QuickBooks Enterprise v15 - with 10 users connecting via RemoteApp and/or RD.
    So for the new Azure server we selected the A3 (4 cores/7GB RAM) option - running Windows Server 2008 R2 - I know old O/S - but still need for now.  We also attached a new 250GB disk for additional storage of data/programs.  The test users noticed
    performance issues right away so we temporarily upgraded it to the A4 option (8 cores/14GB RAM) and although it is better there are still performance issues.  For example to run a statement report in QuickBooks - viewing both connections side-by-side
    - it is 2x faster on the current "live" server we have hosted with the current hosting provider.  On the Azure VM we also tested installing QuickBooks on the C:\ drive and then on the attached 250GB disk - the F:\ drive - same performance results
    on both.  
    This is a new VM...fresh install....just so strange why it would be slower than the 8GB server.
    Does anyone have any ideas on where we can start to troubleshoot this?  
    Thanks so much!

    I understand the performance seen by end users is low, but do you know what is actually wrong? Is the Azure VM using close to 80% of its CPU and RAM? If not, adding more RAM and CPU may not really help you.
    - How does the CPU usage of your on-premise node compare with the Azure VM?
    - How does the memory usage of your on-premise node compare with the Azure VM?
    Do you have any specific disk IO requirements? Can you check IOPS on-premise vs Azure?
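    A quick way to put numbers on those questions is to sample the standard Windows performance counters on both machines and compare. This is only a sketch; the sample interval and count are arbitrary.
    # Sketch: sample CPU, available memory, and rough disk IOPS/latency for one minute.
    $counters = '\Processor(_Total)\% Processor Time',
                '\Memory\Available MBytes',
                '\PhysicalDisk(_Total)\Disk Transfers/sec',      # rough IOPS
                '\PhysicalDisk(_Total)\Avg. Disk sec/Transfer'   # disk latency
    Get-Counter -Counter $counters -SampleInterval 5 -MaxSamples 12 |
        ForEach-Object { $_.CounterSamples | Select-Object Path, CookedValue }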

  • Voicemail access issue when migrated to new Domino server

    At the current time we have one Domino v7.0.4 server (Win2k with DUC v1.2.3) in production and our clients are running the Notes v7.0.4 client with DUC v1.2.3 installed. This setup has been working great in our Unity v5.0(1) environment. We are looking to migrate our users to a newly installed Domino v8.5.1 server (Win 2008 x32, DUC v1.2.5) in the near future, but our test users are running into a problem.
    When a user gets migrated over to our new Domino server the client can no longer access their voicemail via TUI.  During the user migration process we do change the pointer in Unity to their mail file on the new Domino server. When the user logs in they receive "Your messages are available now"
    If we migrate the user back to the old v7.0.4 server they can access their voice mail with no issues.
    Any feedback you could provide would be great. Thanks!

    Hi Kenneth,
    Could you clarify what you mean by "During the user migration process we do change the pointer in Unity to their mail file on the new Domino server"? Typically when moving a user's mail file to a new Domino server, Unity should automatically pick up those changes. Are you perhaps creating a new mail file and pointing Domino to it instead of moving it? If that is the case, the new mail file would not have been DUCS enabled, a process it must go through before Unity will be able to access it properly. The only way to force that to happen would be to reimport the user into Unity. If the mail file is properly migrated, however, it should remain DUCS enabled and there would be no need for this.
    An easy way to determine whether the user is DUCS enabled is using the DUT tool. It is located on the Unity server under \Commserver\Utilities\Domino\DUT. It will show you a list of all of your users. Compare the values you see in the UCProfile section for a working / non-migrated user to a failing one. You might also want to verify that it lists the updated mail server and mail file for the user.
    Also, while I don't think it would cause this particular problem, you should be aware that Server 2008 is not currently a supported platform for DUCS/CsServer 1.2.5, see page 6 in the admin guide:
    http://www.cisco.com/en/US/docs/voice_ip_comm/unity/duc/admin/guide/cuducag.pdf
    Hope this helps,
    Pat

  • LDAP is not working on new Web Server

    Hi, I configured LDAP authentication and it was working fine. After this I installed a new web server and copied the security certificate etc. (copied everything from the other web server) onto the new web server. When I try to log in to InfoView or CMC I get the error message "Security plugin error: Failed to set parameters on plugin" from the new web server. It works fine from the old web servers.
    Not sure what else I have to do now, as I have done the same thing on old servers in the past.
    We are on BOXI R2 SP3 with the web server on IIS 6.0. We have three web servers and 3 processing servers (with all services) in a clustered environment.
    Thanks,

    I am sorry, I got confused with pure Enterprise authentication; I should have referred to my notes. I apologize for this. The web application server is involved in communicating with LDAP. Below is the process:
    1) The user logs into the application
    2) The web application server's security plugin sends the credentials to the LDAP directory
    3) The LDAP directory authenticates the user
    4) The web application server's security plugin sends the user's credentials to LDAP
    5) The CMS requests user and group info from LDAP
    6) LDAP returns this information to the CMS security plugin
    7) The CMS grants access if the user is a member of a mapped group
    8) If access is granted, both the CMS and WAS plugins create a session
    9) The WAS sends an Enterprise session token to the user's browser
    I was referring to logging into InfoView using LDAP.
    Thanks,

  • Crystal Report 9 performance issue when we move old DB to new DB server

    We are using Crystal Reports 9; recently we moved to a new database server, but the reports' data sources still point to the old database server, which is no longer on the network.
    Now we have a performance issue: when we call the reports from the web application it takes a long time to retrieve a report, at least 1 minute. If we manually update a report's data source to point to the new server, the report pulls data in about 1 second. Is there any easy way to update all the reports to point to the new server? We have 3000 reports, so we cannot use a manual process; it would take too long, and it would also be a problem in the future if we change servers again.
    Thanks!
    Ram

    The 1-minute delay is the ODBC driver's default connection timeout while it tries to reach the old server. Crystal Reports has no tool to migrate/update database connection info. You can develop your own, though, or search the web for a third-party tool; I'm sure there are lots out there.

  • Error when creating a new Reports Server instance

    Hi,
    I have tried to create and start a new reports server instance using the following method:
    1- Creating the report server instance:
    rwserver server=%newrepserver% start
    2- Stopping the OPMN:
    %ORACLEHOME%\opmn\bin\opmnctl stopall
    3- Adding a new server target to OPMN.XML
    %ORACLEHOME%\bin\addNewServerTarget.bat %newrepserver%
    4- Updating the configuration with new OPMN settings:
    %ORACLEHOME%\dcm\bin\dcmctl.bat updateconfig -ct opmn -v -d
    %ORACLEHOME%\dcm\bin\dcmctl.bat resyncinstance -v -d
    5- Starting the OPMN again:
    %ORACLEHOME%\opmn\bin\opmnctl startall
    However, when I try to start the OPMN again, I receive the following error:
    opmn id="##########:6200 5 of 6 processes started
    ias-instance id=red.########.########.lan
    ++++++++++++++++++++++++++++++++++
    ias-component/process-type-process-set:
    %newrepserver%/ReportsServer/%newrepserver%
    Error --> Process (pid=6888) failed to start a managed process after the maximum retry limit.
    Log: %ORACLEHOME%\opmn\logs\%newrepserver%~ReportsServer~%newrepserver%
    Any ideas as to why the reports server instance cannot be started?
    I have tried rebooting the server many times and have recreated other reports server instances but with no luck.
    Many thanks in advance,
    Chris

    On Windows this is caused most often by the temporary directory setting.
    Change the temporary directory to some fixed directory writable by anybody, e.g. "C:\temp".
    Change this in:
    Registry entry: hkey_local_machine\software\oracle\key_<oashome>\REPORTS_TMP
    <oashome>\opmn\conf\opmn.xml: search for environment variable TEMP (under ias-instance)
    Good luck!
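    If it helps, the registry half of that change can be scripted. This is only a sketch that follows the reply above, with KEY_<oashome> left as a placeholder for your actual Oracle home key; run it in an elevated session.
    # Sketch: point REPORTS_TMP at a fixed, world-writable directory.
    New-Item -Path 'C:\temp' -ItemType Directory -Force | Out-Null
    Set-ItemProperty -Path 'HKLM:\SOFTWARE\ORACLE\KEY_<oashome>' -Name 'REPORTS_TMP' -Value 'C:\temp'
    # opmn.xml (under <oashome>\opmn\conf) still needs its TEMP environment variable
    # updated by hand, then restart OPMN with opmnctl.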

  • Power Bi for o365 - Odata connection test worked but "The server encountered an error processing the request. See server logs for more details". Port 8051? Authority\System

    We set up the Data Management Gateway and created a new data source (odata to SQL via sqL user)
    Did a connection test and it was successful!
    Tried the URL (maybe it needs more):
    https://ourdomain.hybridproxy.powerbi.com/ODataService/v1.0/odatatest
    That resolves to some :8051 port address and then spits out this message:
    The server encountered an error processing the request. See server logs for more details.
    I checked and the data management gateway is running.
    Does port 8051 need to be opened on our firewall for this server? How can I confirm whether that is the issue? I see no event on the server indicating that it is.
    I am seeing this event:
    Login failed for user 'NT AUTHORITY\SYSTEM'. Reason: Failed to open the explicitly specified database 'PowerBiTest'. [CLIENT: IP of the Server]

    O365,
    Is this still an issue?
    Thanks!
    Ed Price, Azure & Power BI Customer Program Manager
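    Two quick checks, sketched below. Test-NetConnection needs Windows 8 / Server 2012 or later, the gateway host name is a placeholder, and the second comment is only an interpretation of the event quoted above, not a confirmed diagnosis.
    # Sketch: verify the gateway endpoint is reachable on port 8051 from a client machine.
    Test-NetConnection -ComputerName 'gateway.ourdomain.local' -Port 8051   # placeholder host name
    # The "Login failed for user 'NT AUTHORITY\SYSTEM'" event suggests the service running as
    # LocalSystem cannot open the PowerBiTest database; checking that login's SQL permissions
    # is a separate step from the firewall/port test above.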

  • New M5000 server - Can't get bge0 to work in the OS

    Just recently powered up a new M5000 server. Went through the setup process of setting the networking details of the management card - which I believe also works through ethernet0 (LAN#0). That is working just fine and is what I have used to access the system console. Single domain, nothing odd in the configuration.
    The pre-installed Solaris sees bge0 and bge1. It also shows the ppp connection for the dscp.
    ifconfig shows a normal adapter config but without the RUNNING flag. cfgadm shows the drivers properly loaded. There are cables connected to both NIC cards; the same result, btw, on the bge1 card. The NICs show activity LEDs, so there is definitely a physical connection there.
    What else can I provide to help anyone assist me in solving this problem?
    Thanks
    Norm Dressler

    Hi all, I'm new to the forum
    Have installed the above mentioned server running the x86 platform with Solaris 10 1/08 with broadcom ethernet adaptor cards.
    My network connections are giving issues. bge0 and bge1 both show physical activity, and these are my configs:
    # ifconfig -a
    lo0: flags=2001000849<UP,LOOPBACK,RUNNING,MULTICAST,IPv4,VIRTUAL> mtu 8232 index 1
    inet 127.0.0.1 netmask ff000000
    bge0: flags=1000843<UP,BROADCAST,RUNNING,MULTICAST,IPv4> mtu 1500 index 2
    inet 192.168.1.170 netmask ffffff00 broadcast 192.168.1.255
    ether 0:11:25:22:1f:26
    bge0: flags=1000843<UP,BROADCAST,RUNNING,MULTICAST,IPv4> mtu 1500 index 3
    inet 10.0.0.1 netmask ffffff00 broadcast 10.255.255.255
    ether 0:11:25:22:1f:27
    My default router also exists
    # vi /etc/defaultrouter
    192.168.1.254
    ~
    The hosts file
    # vi /etc/hosts
    ::1          localhost
    127.0.0.1 localhost
    192.168.1.170 hostname1      loghost
    10.0.0.1      hostname1
    ~
    ~
    And the device files
    # vi /etc/hostname.bge0
    hostname1
    ~
    ~
    # vi /etc/hostname.bge1
    hostname2
    ~
    ~
    All of this, including restarts, isn't working.
    I don't seem to be able to reach other devices on the network, yet the local TCP/IP stack behaves just fine: I can ping all the local interfaces but cannot ping any device on the network.
    During install, too, I attempted to acquire a DHCP address from the DHCP server that serves IPs to our network, but that failed, so I proceeded with static addressing.
    Any advice please?

  • Issue when Bursting reports via new Precalculation Server

    Hello,
    We are trying to migrate away from our unsupported 720 installation of the SAP Precalculation software to a new Precalculation Server running the latest versions. We’ve overcome a number of issues and we can successfully Broadcast using the new server but we are encountering problems
    with bursting.
    The bursts have been running successfully on the old 720 server, so we know that the BW side must be fine, which suggests the problem is with the Precalculation server. We have been through the Precalculation checklist and the server appears to have been built successfully. The Precalculation and Business Explorer software has been patched to the latest version and it's running Excel 2013.
    The main error message we are seeing is “The RPC server is unavailable” HRESULT: 0x800706BA.
    A Screen shot showing the error in RSRD_LOG  is shown in the attached document.
    Below is an extract from the log of the Precalc server showing this error. Just before the "RPC server is unavailable" message it states "Error occured on closing opened workbooks."
    ZPREC730_1:9/3/2014 2:10:20 PM.777 (0) -> Refresh BExAnalyzer.xla!MenuRefreshPrecalc failed 1 time(s).
    ZPREC730_1:9/3/2014 2:10:25 PM.785 (0) -> Calling refresh BExAnalyzer.xla!MenuRefreshPrecalc
    ZPREC730_1:9/3/2014 2:10:25 PM.785 (0) -> Refresh BExAnalyzer.xla!MenuRefreshPrecalc failed 2 time(s).
    ZPREC730_1:9/3/2014 2:10:30 PM.792 (0) -> Calling refresh BExAnalyzer.xla!MenuRefreshPrecalc
    ZPREC730_1:9/3/2014 2:10:30 PM.792 (0) -> Refresh BExAnalyzer.xla!MenuRefreshPrecalc failed 3 time(s).
    ZPREC730_1:9/3/2014 2:10:35 PM.800 (0) -> Calling refresh BExAnalyzer.xla!MenuRefreshPrecalc
    ZPREC730_1:9/3/2014 2:10:35 PM.800 (0) -> Refresh BExAnalyzer.xla!MenuRefreshPrecalc failed 4 time(s).
    ZPREC730_1:9/3/2014 2:10:40 PM.808 (0) -> Calling refresh BExAnalyzer.xla!MenuRefreshPrecalc
    ZPREC730_1:9/3/2014 2:10:40 PM.808 (0) -> Refresh BExAnalyzer.xla!MenuRefreshPrecalc failed 5 time(s).
    ZPREC730_1:9/3/2014 2:10:45 PM.815 (0) -> Refresh BExAnalyzer.xla!MenuRefreshPrecalc returned with 0.
    ZPREC730_1:9/3/2014 2:10:45 PM.815 (0) -> Error occured on closing opened workbooks.
    ZPREC730_1:9/3/2014 2:10:45 PM.815 -> An Exception occured in thread '0':
    ZPREC730_1:The RPC server is unavailable. (Exception from HRESULT: 0x800706BA)
    ZPREC730_1:System.Runtime.InteropServices.COMException (0x800706BA): The RPC server is unavailable. (Exception from HRESULT: 0x800706BA)
    When this error occurs an error is reported in the Event Viewer on the server
    Faulting application name: EXCEL.EXE, version: 15.0.4535.1507, time stamp: 0x52282875
    Faulting module name: EXCEL.EXE, version: 15.0.4535.1507, time stamp: 0x52282875
    Exception code: 0xc0000005
    Fault offset: 0x005b447e
    Faulting process id: 0x%9
    Faulting application start time: 0x%10
    Faulting application path: %11
    Faulting module path: %12
    Report Id: %13
    We have the latest 730 patches installed on the Precalculation and Business Explorer software. We are running Excel 2013 (32bit) on a Virtual Machine running Windows 2008 R2.
    If you have any suggestions on how to resolve this problem I'd be delighted to hear from you!
    Many thanks,
    Mark

    Hi,
    There is no easy fix for this. Please go through the precalc check list which should solve the issue:
    Checklist for Precalculation Server - SAP NetWeaver Business Warehouse - SCN Wiki
    Regards,
    Michael

  • How to add new database server instance in Sharepoint 2010?

    I have installed SQL Server 2012 BI edition and have been trying to add a new SQL Server Reporting Services service in SharePoint 2010 Central Administration. But when I click "OK" after giving the new SQL Server name during the creation process, I get an error stating "This user does not have permission on the SQL server", even though I have given this user the dbcreator role on that particular database instance. Could you please help me out?

    Does the initial installation account used to set-up the farm have these permissions on the database?
    Steven Andrews
    SharePoint Business Analyst: LiveNation Entertainment
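    If the farm/setup account does turn out to be missing server-level rights, granting them can be scripted. This is a sketch only: Invoke-Sqlcmd needs the SqlServer (or older SQLPS) module, the instance and account names are placeholders, the login must already exist on the instance, and the exact roles required should be confirmed against the SSRS integration documentation (dbcreator and securityadmin are the ones commonly cited).
    # Sketch: add the setup account to the server roles commonly needed when creating
    # the SSRS service application databases. All names are placeholders.
    Import-Module SqlServer
    $instance = 'SQLBI01\REPORTING'
    $account  = 'CONTOSO\sp_setup'
    Invoke-Sqlcmd -ServerInstance $instance -Query "ALTER SERVER ROLE [dbcreator] ADD MEMBER [$account];"
    Invoke-Sqlcmd -ServerInstance $instance -Query "ALTER SERVER ROLE [securityadmin] ADD MEMBER [$account];"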

  • My acrobat plugin application crashes in Citrix server

    Hi,
    The old version of my plugin application was able to run in Acrobat installed on a Citrix server. In the old version, the resource file was present in the plugin project folder itself.
    In the new version of my plugin application, I maintain a separate project folder containing the resource files and created a DLL from it, which gets loaded by my plugin project. This is the only difference between my old and new versions.
    On the Citrix server I am able to install the application into Acrobat, but after installation, if I click on Acrobat I get a run-time error saying:
    Run time error!!
    Program: <acrobat installation path>
    abnormal program termination!!
    I developed the plugin using Acrobat SDK 7.
    What might the problem be, or what went wrong, such that the application crashes on the server?
    Please help, as I have no idea about Citrix servers; I am new to this environment.
    Thanks in advance,
    Sind

    Hi,
    Can you check whether you are experiencing the same crash on another machine as well? What does your plugin application do?

  • How to do a patch upgrade on a Citrix server?

    Dear all,
    How do we do a patch upgrade on the Citrix server (SAP B1 2005B PL41 to PL49)?
    Do I also need to do the same on the DB server?
    Jeyakanthan

    Hi Jayakanthan,
    The patch upgrade is done on the SAP server only. After completing the Patch 49 upgrade on the SAP server, open SAP B1 on the server machine; it will start to upgrade the DB. After the DB upgrade is complete, open SAP B1 on the Citrix server; it will start its own upgrade process, and when that is complete, restart the Citrix server.
    There is no separate patch installation on the Citrix server: only the SAP client is installed there, so you can just open SAP B1 on the Citrix server (and on the other client machines) and it will automatically start to upgrade.
    Please close the thread if the issue is solved.
    Regards
    Jambulingam.P
