Data Packages not hitting server

Hi,
I have an issue where users log in remotely and submit data packages. When they run a package, they get a "successfully submitted" message, but when they look in the View Status box, the package never arrives.
I have to admit this only happens when they run the packages for a lot of entities, but it should not be happening. Has anyone encountered this problem before?
Regards,
Andries

I have faced this same problem, now running on BPC 5.1 SP8 (on SQL 2005), but first starting back on 5.1 SP3 or thereabouts. Support notes / release notes claim it was fixed sometime around SP4, but I am still facing it on SP8.
It's sporadic, and I find it happens on both wireless and Ethernet LAN connections from the client. I even find it happens when I'm running Excel on the server itself, connecting via an RDP connection.
I've filed a message with SAP support, but haven't yet had the time to reproduce it in ApShell (which is difficult to do, since it's sporadic).
I've set up Data Manager debugging logs on both client and server, and find that they are of very little use: when this problem occurs, exactly nothing is written to either log.
Restarting the COM apps on the server always seems to resolve the problem; afterward the packages always seem to run on the first attempt.
I've also noticed that while one user cannot run a package (even after 5 or 10 or 20 attempts), other users can run packages on their first attempt. Then an hour later when the admin bounces the COM apps, the first user can run their package fine.
The problem seems to occur regardless of the type of package. Most packages are simple script logic, calling a custom logic file (normally LGF, not LGX) but some are calling the standard copy package, or custom DTS packages that run a stored procedure directly against the database.

Similar Messages

  • BIA Server problem - Data package not consistent

    Dear All,
    I have a strange issue with my BIA server. When I run a report selecting the topmost nodes of the hierarchy,
    the report ends with the message "No applicable data" and the error messages below:
    Aggregates of ZFICSC22 were compressed. Data package is not consistent
    Error while reading data, navigation is possible
    The report works fine when I select the subsequent nodes (the immediately lower nodes) of the hierarchy.
    The report also works when I execute it bypassing the BIA.
    I have cleared all the hierarchy buffers, but the problem persists.
    As of now, the last option I am left with is rebuilding the indexes, unless one of you comes up with a better solution.
    Thanks & Regards
    Ziya
    Edited by: Ziyauddin Mohammed on Dec 5, 2008 4:02 PM

    Hi,
    Have you implemented OSS Note 1163991?
    There is nothing like data package compression in BIA.
    Regards,
    -Vitaliy
    Edited by: Vitaliy on Dec 5, 2008 9:08 AM

  • Out-of-date package not updated in almost a year

    The question I have is: what should I do when I see a package that has not been updated in almost a year after it was flagged out-of-date? I can't seem to find any info on how such a situation is handled. I can't re-flag it, and sending an email to the maintainer seems intrusive and/or rude. There may be an underlying reason not to upgrade, or the flag may simply have been lost in time and forgotten.
    The package I have in mind is opencl-headers. It got stuck on v1.1, was never updated to v1.2, and now still not to v2.0. There is an AUR package for the 1.2 headers, but it seems silly to have that around.
    Thanks for any input!

    I see, there is a backstory to all this. It seems, though, that momentum was lost when the ocl-icd devs did not respond to the email requesting info. The last commit to their repo was for 2.0 support, back in 2013. It seems abandoned since then; maybe a new spark of activity will come when a new OpenCL spec comes out.
    The current situation is quite terrible, really: libcl in the repos comes from the NVIDIA package, which means a hard dependency on a vendor not willing to follow the latest version of the spec. Way back when, that was probably a sane choice, but now, as the email linked by Allan shows, the situation is different.
    Either choice, be it ocl-icd or Khronos, will improve the situation (OpenCL version and license-wise). Currently ocl-icd has one advantage (detailed in the POCL developer docs): choosing the icd at runtime.
    As any CUDA or AMDAPP or Intel OpenCL package will include their libOpenCL, and it can be selected by LD_LIBRARY_PATH if required, I hope a sane libOpenCL package will soon make it into the repo.
    As for backward compatibility: I believe the headers are backwards compatible, and it is up to the applications to verify that the device they want to use supports the functionality they use. As for forward compatibility: if one solution fails and an alternative is better, just replace the package providing the ICD library with a better one.
    Note that both the OpenCL headers and the ICD (libOpenCL) are designed to be platform-agnostic and should in principle work with all versions of the spec and all devices, limited only by the device capabilities as reported through OpenCL functions. Currently, libcl and the old opencl-headers version are the limiting factors, which is Terrible®, given the (open source) alternatives.

  • Error trying to run SSIS Package via SQL Server Agent: DTExec: Could not set \Package.Variables[User::VarObjectDataSet].Properties[Value] value to System.Object

    Situation:
    SSIS Package designed in SQL Server 2012 - SQL Server Data Tools
    Windows 7 - 64 bit.
    The package (32 bit) extracts data from a SQL Server db to an Excel Output file, via an OLE DB connection.
    It uses 3 package variables:
    *) SQLCommand (String) to specify the SQL statement to be executed by the package
    Property path: \Package.Variables[User::SQLCommand].Properties[Value]
    Value: select * from CartOrder
    *) ExcelOutputFile (String) to specify the path and filename of the Excel output file
    Property path: \Package.Variables[User::ExcelOutputFile].Properties[Value]
    Value: f:\Output Data.xls
    *) VarObjectDataSet (Object) to hold the data returned by SQL Server
    Property path: \Package.Variables[User::VarObjectDataSet].Properties[Value]
    Value: System.Object
    It consists of 2 components:
    *) Execute SQL Task: executes the SQL statement passed in via a package variable. The resulting rows are stored in the package variable VarObjectDataSet.
    *) Script Task: creates the physical output file and iterates VarObjectDataSet to populate the Excel file.
    Outcome and issue: The package runs perfectly fine both in SQL Server Data Tools itself and in DTEXECUI.
    However, whenever I run it via SQL Server Agent (with the 32-bit runtime option set), it returns the error message below.
    This package contains 3 package variables, but the error stating that a package variable cannot be set pops up for VarObjectDataSet only. This makes me wonder whether it is at all possible to set the value of a package variable of type Object.
    Can anybody help me with this, please?
    Message
    Executed as user: NT Service\SQLSERVERAGENT. Microsoft (R) SQL Server Execute Package Utility  Version 11.0.2100.60 for 32-bit  Copyright (C) Microsoft Corporation. All rights reserved.    Started:  6:40:20 PM  DTExec: Could
    not set \Package.Variables[User::VarObjectDataSet].Properties[Value] value to System.Object.  Started:  6:40:20 PM  Finished: 6:40:21 PM  Elapsed:  0.281 seconds.  The package execution failed.  The step failed.
    Thank you very much in advance
    Jurgen

    Hi Visakh,
    thank you for your reply.
    So, judging by your reply, not all package variables used inside a package need to have a value set when the package is run via DTEXEC?
    I already tried that, but my package ended up in error (something to do with "... invocation ....") and that error is anything but clearly documented. Judging by the error message itself, it looks like it could be just about anything. That is why I asked my first question about the Object-type package variable.
    Now I will remove it from the 'set values' list and have another go at cracking the unclear "... invocation ..." error message. Does an error message about "... invocation ..." ring any bells, now that we are talking about it here?
    Thx in advance
    Jurgen

    Yes, exactly.
    You need to set values only for those variables that need to be controlled from outside the package.
    Any variable that gets its value through an expression set inside the package, or through a query inside an Execute SQL Task / Script Task, can be left out of the DTExec /SET list.
    OK, I've seen the invocation error mostly inside the Script Task. It is usually caused by an error in the script itself. If it appeared after you removed the variable, it may be because some reference to the variable still exists within the Script Task code.
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs
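    To make the distinction concrete, here is a rough sketch of what the DTExec call for this package might look like if only the two String variables are passed in via /SET and the Object variable is left alone. The .dtsx path is a placeholder; the variable names and values are taken from the variable list above:
        dtexec /FILE "C:\SSIS\ExportCartOrders.dtsx" ^
          /SET \Package.Variables[User::SQLCommand].Properties[Value];"select * from CartOrder" ^
          /SET \Package.Variables[User::ExcelOutputFile].Properties[Value];"f:\Output Data.xls"
        REM No /SET for User::VarObjectDataSet - it is populated at runtime by the Execute SQL Task.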

  • Data Package 1 arrived in BW; Processing: 2nd processing step not finished

    I am currently working on BI 7 and trying to pull data from r/3.
    I have successfully loaded the data into the PSA, but now when I try updating the data (only 66 records, it's an init run) into an ODS, the system is taking a long time. On the monitor, under Details, I see:
    Extraction: Errors occurred (with a delta warning sign), and
    Transfer: Data Package 1 arrived in BW; Processing: 2nd processing step not finished.
    On the Status tab it is yellow, with the message: Correct data request; processing running.
    I hope I have made it clear. Please help.

    Hi,
    I suggest you check a few places where you can see the status:
    1) SM37 job log (give the request name): it should give you the details about the request. If it's active, make sure that the job log is being updated at frequent intervals.
    2) SM66: get the job details (server name, PID, etc. from SM37) and see in SM66 whether the job is running or not.
    3) RSMO: see what is available in the Details tab. It may be stuck in the update rules.
    4) ST22: check whether any short dump has occurred.
    Once you identify the error, you can rectify it.
    If all the records are in the PSA, you can pull them from the PSA to the target. Otherwise you may have to pull them again from the source InfoProvider.
    If it's running and you can see it active in SM66, you can wait for some time to let it finish. You can also use SM50 / SM51 to see what is happening at the system level, such as reading from or inserting into tables.
    If you feel it's active and running, you can verify this by checking whether the number of records has increased in the new data table.

  • LS Getting the error "The data could not be saved because the server could not be contacted" when deploying on my local desktop

    Hi:
    I have a very simple test LS App:
    1 Table "Person" with 1 property "Name"
    One totally standard Editable Grid
    So there are no complicated calculations that could time out...
    When I hit F5 the screen launches and shows the data, but when I try to create a new Person it waits for about 2 minutes and then the following error appears:
    The data could not be saved because the server could not be contacted. Please check your network connection and try saving again. The Operation timed out.
    Despite the error, the data does get persisted to the database.
    I am on VS 2013 Pro Update 4 and SQL Server LocalDB 2012 (v11.0).

    If you decide to try again I would:
    1) Uninstall Visual Studio
    2) Uninstall SQL Server (all versions)
    3) Re-install SQL Server
    4) Re-Install Visual Studio
    Unleash the Power - Get the LightSwitch 2013 HTML Client / SharePoint 2013 book
    http://LightSwitchHelpWebsite.com

  • Data not uploaded to the BPC server

    Hi
    Recently I uploaded data through the Data Manager in SAP BPC 5.1, Microsoft platform.
    It shows "File transferred successfully".
    But when I checked in View Status, the data was not uploaded.
    What is the cause of this error?
    Regards
    Dayalan M

    Hi,
    I think you are mixing up two things. "File transferred successfully" will of course be displayed when you upload the file to the server, but this is not linked at all to the View Status window.
    The View Status window refers to packages (import, export, copy, etc.), and it will display something only when a package has been or is being run.
    You should try Nilanjan's suggestion above. Hopefully it will help you with this functionality.
    Kind Regards,
    Patrick

  • Comments not imported from Data Dictionary of SQL Server. SDDM 3.3.0.747

    Hi,
    SDDM 3.3.0.747 32-bit on Windows 7 64-bit.
    Comments are not imported from the Data Dictionary of SQL Server 2008. The connection is through Microsoft JDBC Driver 4.0 for SQL Server or jTDS 1.2.7.
    What have I tried? In SDDM DDL generation, comments for SQL Server are generated with "EXEC sp_addextendedproperty 'MS_Description' , 'Test Comment' ...", so I added an extended property named "MS_Description" to the SQL Server database, on both a table and a column. Neither of them was imported from the Data Dictionary into SDDM. I have tried both drivers stated above. Is it a bug, or am I missing something?
    I've found a similar question thread, Re: Data dictionary import doesn't import column comments, for SDDM 3.0.0.665, so I guess it is a bug when importing with JDBC drivers.
    MiGli
    Edited by: MiGli_1006342 on May 25, 2013 8:32 AM
    Edited by: MiGli_1006342 on May 25, 2013 9:02 AM

    Extended properties were not imported correctly from SQL Server databases in DM 3.3.0.747.
    Calls to sp_addextendedproperty and fn_listextendedproperty have been modified.
    I don't think it is a problem with JDBC drivers.
    A bug fix should be available in the next release of DM.
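    For anyone reproducing this on the SQL Server side, the round trip looks roughly like the sketch below; the table and column names are made up for illustration, while sp_addextendedproperty and sys.fn_listextendedproperty are the documented routines involved:
        -- Attach an MS_Description comment to a hypothetical table and one of its columns
        EXEC sp_addextendedproperty
             @name = N'MS_Description', @value = N'Test Comment',
             @level0type = N'SCHEMA', @level0name = N'dbo',
             @level1type = N'TABLE',  @level1name = N'Customer';
        EXEC sp_addextendedproperty
             @name = N'MS_Description', @value = N'Customer name',
             @level0type = N'SCHEMA', @level0name = N'dbo',
             @level1type = N'TABLE',  @level1name = N'Customer',
             @level2type = N'COLUMN', @level2name = N'Name';
        -- Read the column-level comments back, which is essentially what the Data Dictionary import has to do
        SELECT objname, value
        FROM   sys.fn_listextendedproperty(N'MS_Description',
               N'SCHEMA', N'dbo', N'TABLE', N'Customer', N'COLUMN', NULL);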

  • DTP does not fetch all records from Source, fetches only records in First Data Package.

    Fellas,
    I have a scenario in my BW system, where I pull data from a source using a Direct Access DTP. (Does not extract from PSA, extracts from Source)
    The Source is a table from the Oracle DB and using a datasource and a Direct Access DTP, I pull data from this table into my BW Infocube.
    The DTP's package size has been set to 100,000 and whenever this load is triggered, a lot of data records from the source table are fetched in various Data packages. This has been working fine and works fine now as well.
    But, very rarely, the DTP fetches 100,000 records in the first data package and fails to pull the remaining data records from source.
    It ends, with this message "No more data records found" even though we have records waiting to be pulled. This DTP in the process chain does not even fail and continues to the next step with a "Green" Status.
    Have you faced a similar situation in any of your systems?  What is the cause?  How can this be fixed?
    Thanks in advance for your help.
    Cheers
    Shiva

    Hello Raman & KV,
    Thanks for your Suggestions.
    Unfortunately, I will not be able to implement any of your suggestions, because I'm not allowed to change the DTP settings.
    So I'm working on finding the root cause of this issue and came across SAP Note 1506944 (Only one package is always extracted during direct access), which says this is a program error.
    Hence, I'm checking further with SAP on this and will share their insights once I hear back from them.
    Cheers
    Shiva

  • SharePoint list datasheet view error "Cannot connect to the server at this time. You can continue working with this list, but some data may not be available"

    I have a list with around 14,000 items in it. While opening that list in datasheet view, the error below occurs.
    Below is a summary of the issue:
    After selecting datasheet view, the following error occurs:
        "Cannot connect to the server at this time.  You can continue working with this list, but some data may not be available."
        "Unable to retrieve all data."
        The item count displays, say, 100 out of 14,000 items.
    Exporting the list to Excel gives only 2,000 records out of 14,000.
    Other Observations   -  
    This is happening to only one list on the site. There are other lists on the site with around 8,000 to 9,000 records, and they work absolutely fine without any error.
    Also, if I save this list as a template and create another list from it, the new list works absolutely fine with 14,000 records, so the issue does not seem to be related to the number of records, since the template-based list works fine.
    I have checked the Alternate Access Mapping settings; they are fine.
    It should not be related to a lookup, date field, or any other column, as the list created from the template works fine with all these columns.
    I have also checked the links below, but they don't seem to help in my case.
    http://social.technet.microsoft.com/forums/en-US/sharepointadminprevious/thread/974b9168-f548-409b-a7f9-a79b9fdd4c50/
    http://social.technet.microsoft.com/Forums/en-US/smallbusinessserver/thread/87077dd8-a329-48e8-b42d-d0a8bf87b082
    http://social.msdn.microsoft.com/Forums/en-US/sharepointgeneral/thread/dc757598-f670-4229-9f8a-07656346b9b0

    I have spent two days resolving this issue. Microsoft has released two KBs that address it, but they do not appear at the top of search results.
    I am sharing my findings.
    1. First install KB2552989 (hopefully you have already installed it; the installer detects this and informs the user).
    2. Then update the registry by adding a new key for the data fetch timeout, as described in KB2553007.
    These two steps resolved the issue in our environment. Hope it might help others as well.
    Pradip T. ------------- MCTS(SharePoint 2010/Web)|MCPD(Web Development) https://www.mcpvirtualbusinesscard.com/VBCServer/paddytakate/profile

  • Not able to Parse XML data in Sun Application Server 8.2

    I am not able to parse the XML data in Sun Application Server; this works fine in Tomcat and in the Oracle application server.
    Please go through the code below; it is used in an Ajax callback. Thanks in advance.
    function processStateChange() {
        if (req.readyState == 4) { // Complete
            if (req.status == 200) { // OK response
                var message = req.responseXML.getElementsByTagName("value")[0];
                setMessage(message.childNodes[0].nodeValue);
                //document.getElementById("theResponse").innerHTML = req.responseText;
            } else {
                alert("Problem: " + req.statusText);
            }
            return true;
        }
    }

    Use Notepad. Drag and drop in notepad.
    Aman
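    A likely culprit when the same Ajax code works on Tomcat but not on another application server is the response Content-Type: if the server does not return text/xml, req.responseXML comes back null. Below is a hedged sketch of a more defensive handler; getResponseHeader and DOMParser are standard browser APIs, and the element name "value" is taken from the code above:
        function processStateChange() {
            if (req.readyState == 4 && req.status == 200) {
                var xml = req.responseXML;
                // If the server did not send Content-Type: text/xml, responseXML is null;
                // fall back to parsing the raw response text.
                if (xml == null && window.DOMParser) {
                    xml = new DOMParser().parseFromString(req.responseText, "text/xml");
                }
                if (xml != null) {
                    var message = xml.getElementsByTagName("value")[0];
                    setMessage(message.childNodes[0].nodeValue);
                } else {
                    alert("Could not parse response as XML; Content-Type was: " +
                          req.getResponseHeader("Content-Type"));
                }
            }
        }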

  • HT4623 I want to purchase an iPad 32GB in the larger size but have read many bad reviews that the Wi-Fi does not stay connected, and I don't want to have to buy another data package as I already pay for data on my iPhone. What are my concerns?

    I want to purchase an iPad 32GB in the larger size but have read many bad reviews saying they have problems staying connected to Wi-Fi. I don't want to have to buy a data package, so I only want the Wi-Fi one. I already pay Apple for my iPhone data and I can't afford more money. Why are there so many bad reviews, and are the newer ones that much better? According to many reviews they are not. Please help.

    I already pay Apple for my iPhone data
    You pay your carrier for data, not Apple.
    Why are there so many bad reviews
    Where? Regardless, there are always going to be a handful of people having some sort of issue with any given device, but I would not therefore assume that you would be one of them.

  • Data form not saving entered data after hitting Submit

    Hi,
    Data entered on the data form is not saving after hitting the Submit button in Smart View.
    A rule is attached to the same data form and runs on save; after a retrieve, the entered data is not displayed.
    It's urgent, please help.
    Regards

    You need to have the below updates done on the affected PC.
    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings]\
    "ReceiveTimeout"=00dbba00 (hexadecimal)
    "KeepAliveTimeout"=300000 (decimal)
    "ServerInfoTimeout"=300000 (decimal)
    [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]\
    "ReceiveTimeout"=00dbba00 (hexadecimal)
    "KeepAliveTimeout"=300000 (decimal)
    "ServerInfoTimeout"=300000 (decimal)
    If it's a Windows 7 64-bit machine, the same keys have to be added under the Wow6432Node branch of the same path.
    Once the changes are done, reboot the machine and try launching the BR (business rule) again. Hope it works!
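    For convenience, the same settings expressed as a .reg file look roughly like this (in .reg syntax DWORD values are written in hex, so 300000 decimal becomes 000493e0); adjust or add the Wow6432Node path as noted above for 64-bit machines:
        Windows Registry Editor Version 5.00

        [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings]
        "ReceiveTimeout"=dword:00dbba00
        "KeepAliveTimeout"=dword:000493e0
        "ServerInfoTimeout"=dword:000493e0

        [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]
        "ReceiveTimeout"=dword:00dbba00
        "KeepAliveTimeout"=dword:000493e0
        "ServerInfoTimeout"=dword:000493e0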
    Thanks
    Amith

  • How to Troubleshoot why data is not moving over into the Data Warehouse after Sql Server Agent Job Run

    Hello,
    Here is my problem:
    Data was imported into the staging area. After resolving some errors and running the job, I got the data to move over to the next area. From there, data should be moving over into the DW. I have been troubleshooting for hours and cannot resolve this issue. I have restarted the SQL Server services, I have run a couple of packages manually, and the job runs successfully.
    What are some reasons why data is not getting into the data warehouse? Where should I be looking? 
    Your help is greatly appreciated!!

    Anything is possible.
    So, just to reiterate: running the job manually works, while the scheduled job produces no errors but also no data arriving in the DW, right? And it used to work, correct?
    If so, the first step would be to examine the configuration(s). But not before you inspect the package. Are you able to export it to the file system and open it in BIDS?
    Arthur My Blog
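    On the "where should I be looking" question, one low-effort check is the Agent job history in msdb, which records per-step outcomes and messages for the scheduled runs; a minimal sketch, with the job name as a placeholder:
        -- Step-level history for the scheduled load job
        EXEC msdb.dbo.sp_help_jobhistory
             @job_name = N'DW Load Job',   -- replace with the actual job name
             @mode     = N'FULL';          -- FULL includes the step messages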

  • Data Package is not consistent error

    Hi All,
    We have a BEx issue; a screenshot is given in the link below. We have an InfoCube with a huge data volume. We load it very frequently (every hour, from 2 sources). This InfoCube has been loading for about the last 5 years. We stopped compressing the data about a year ago for some reason. We are thinking of compressing the data sometime soon. All of a sudden, users are getting the error shown in the link below more frequently. Not much information is available on the Service Marketplace about this issue. Can somebody please shed some light on this?
    [Error Screenshot|https://photos-1.dropbox.com/i/l/orsd6TKDlIBVoH4JO79GWQAPNFPlTa88QJ0zkV7Lcc0/53945346/1324418400/3799537/Bex%20Error.JPG#1]
    Thanks and Regards
    Subray Hegde

    Hi Deepak,
    I am not able to attach a screenshot. Somebody directed me to servimg.com, but I do not know how to create an account on that site. The error reads like this:
    "Error Aggregates of <Infocube> were compressed. Data package is not consistent. Error. Error while reading data; navigation possible.".
    Thanks and Regards
    Subray Hegde

Maybe you are looking for

  • Offline approval - weird characters in Outlook

    Hi all, We're on SRM 5.0, Server 550, SP12. Scenario: offline approval of Shopping Carts After running report RBBP_NOTIFICATION_OFFAPP, approver is getting an e-mail in Outlook 2003 with some weird characters: **START**      The following require

  • No XML generated

    Hello experts, I have a problem with the generation of the XML stack file for ERP Enhancement Package 4. I get the warning message "No XML generated" in Maintenance Optimizer at step 2.2 (Select OS/DB-Dependent Files). I get this message for Enhancement Package

  • Standard text printing missing on Smartform

    Hi All, I am using a standard text in a Smartform. The standard text contains 'terms and conditions', hence it's used for printing on the second page. For printing the standard text, a window other than the Main window has been created on the second page. Now when I am taki

  • How do I get to the home page of iTunes for iPad to see my account and my apps?

    How do I get to the home page of iTunes to see my account and the apps installed on my iPad or iPod?

  • Can I create a generic Point to Row Column function?

    I have a VI with multiple MultiColumn Listboxes in it.  One way I've seen to make the entries writeable is as follows: This works fine.  The only issue is that to get the 'Point to Row Column' (PtRC) method requires (as I understand it) right-clickin