FI-SL data package too large on delta

Hi Guys,
I'm loading data from the FI-SL totals extractor (3FI_SL_xx_TT).
There is no problem with the delta init, but when we try to load a regular delta the package sizes are totally irregular: 70,158, 52,398, 299,784, 299,982, 57,243, ... As a result I'm getting a TSV_TNEW_PAGE_ALLOC_FAILED error on the transfer from PSA to InfoCube.
Unfortunately we have to load a heavy bunch of data in the delta because the business wants this data monthly, so I'm not able to load it in small requests every day.
To make things worse, I have the start routine for the balance carry-forward (SAP Note 577644 - DataSource 0EC_PCA_3 and FI-SL line item DataSources).
The load does not take the package size settings of the InfoPackage into consideration (1,000 kB / 2 processes / 10 packages per Info IDoc).
Config:
- BI 7.0 SP 23/AIX/Oracle
- ECC: AIX/Oracle PI_BASIS 2008_1_700 000 / SAP_ABA 700 015 / SAP_BASIS 700 015 / SAP_APPL 600 013
I checked SAP Note 917737 (Reducing the request size for totals record DataSources), but it does not apply to us...
Does anyone have an idea how to fix this?
Thanks in advance,
Alex

Chubbyd4d wrote:
Hi masters,
I have a package of about 11,000 lines of code with around 70 procedures and functions in my DB.
I use arrays to handle the global data, and almost every procedure and function in the package uses them.
The package was causing problems in debugging and gave me the "Program too large" error, so the idea came up to split the package into smaller packages.
However, the questions I would like to discuss are:
1. What is the advantage of splitting the package into a few smaller packages, and what is the impact on memory?
2. If I chunk the package into a few packages, will it be easier to use GTTs instead of arrays for the global data, and what is the impact on performance, since a GTT uses I/O?

One of our larger packages is over 20,000 lines and around 500 procedures on a 10.2 database.
No problem or complaints about it being too large.
As for splitting packages, that's entirely up to you. Packages are designed to put all related functionality in one place. If you feel a need to split it out into separate packages, then perhaps consider grouping related functionality together at a smaller granularity than you had previously.
In relation to memory, smaller packages will take up less space obviously, but if you are going to be calling all the packages anyway then they'll all get loaded into memory and still take up about the same amount of memory (give or take a little).
GTTs are generally a better idea than arrays if you are dealing with large amounts of data or you need to run queries against that data. If you need to query the data effectively, you'll probably end up with worse performance processing an array in a query style than the I/O overhead of a GTT. Of course you have to look at the individual processes and weigh up the pros and cons for each.

Similar Messages

  • Error!907: Icon data is too large: 18660 bytes

    Hello everyone,
    when I try to run my BlackBerry application in the JDE
    I get the error
    "Error!907: Icon data is too large: 18660 bytes".
    Can anyone tell me how to resolve this?
    Thanks in advance

    Hi Pallavi,
    why don't you see what the BlackBerry knowledge base has to say?
    [http://www.blackberry.com/knowledgecenterpublic/livelink.exe/fetch/2000/348583/customview.html?func=ll&objId=348583]
    Regards

  • Mkisofs: Value too large for defined data type

    Hi:
    Has anyone else run into this problem when using the mkisofs command?
    <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
    Warning: creating filesystem that does not conform to ISO-9660.
    mkisofs 2.01 (sparc-sun-solaris2.10)
    Scanning iso
    Scanning iso/rac_stage1
    mkisofs: Value too large for defined data type. File iso/rac_stage3/Server.tar.gz is too large - ignoring
    Using RAC_S000 for /rac_stage3 (rac_stage2)
    >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
    Thanks!

    An update on the original bug. This is the direct link to launchpad bug 455122:
    https://bugs.launchpad.net/ubuntu/+sour … bug/455122

  • How do you copy a MYOB file onto a floppy disc when the data is too large?

    Re: vintage Mac Performa 600. We are trying to get financial data from old MYOB (Mind Your Own Business) software copied onto floppy. Is there a way to split this across multiple discs?
    Thanks! We are stumped.

    Hello, highdef11. Welcome to the forums!
    Many, many, many commercial (and probably a few freeware) products exist to do this.
    There's a Stuffit product (I forget the name) that can create "segmented archives." I think DiskDoubler can do this too. Norton Utilities has a program which allows you to create a backup archive that spreads across several floppy disks.
    However, you may be able to do this with just DropStuff (it comes in the "Internet Utilities" folder on OS 9 Macs) and a text editor.
    Open DropStuff. In its preferences, tell it to encode the archives it creates with BinHex. Now tell DropStuff to archive the files you want to archive. It will create a file with a name ending in .hqx
    Open this file with a text editor. It will look like a bunch of random text, with a note at the top that says the file needs to be decoded with BinHex.
    Divide the file into sections and save them onto floppies. Be sure to note the order in which the sections go!
    Then, when you want to take it off the floppies, use a text editor to piece the sections together again. Expand the file with Stuffit Expander, and you're finished!
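    For what it's worth, the split-and-rejoin idea itself (independent of DropStuff, BinHex, or anything that would actually run on a Performa 600) is easy to sketch in modern code. The following Java sketch is purely illustrative: it cuts a file into floppy-sized segments and later concatenates them back in order. The 1,440,000-byte segment size, class name, and file-naming scheme are all assumptions made for the example.

        import java.io.IOException;
        import java.io.InputStream;
        import java.io.OutputStream;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;

        public class SegmentedCopy {

            // Roughly the capacity of a 1.44 MB floppy disk
            private static final int SEGMENT_SIZE = 1_440_000;

            /** Splits a file into numbered segments, each at most SEGMENT_SIZE bytes. */
            static void split(Path source) throws IOException {
                int part = 0;
                try (InputStream in = Files.newInputStream(source)) {
                    byte[] chunk;
                    while ((chunk = in.readNBytes(SEGMENT_SIZE)).length > 0) {
                        Files.write(Paths.get(source + ".part" + part++), chunk);
                    }
                }
            }

            /** Rejoins the segments, in the order given, into a single target file. */
            static void join(Path target, Path... segments) throws IOException {
                try (OutputStream out = Files.newOutputStream(target)) {
                    for (Path segment : segments) {
                        out.write(Files.readAllBytes(segment));
                    }
                }
            }
        }

    The order of the segments is the only thing that matters when rejoining, which is exactly the point made above about noting the order of the BinHex sections.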

  • Data package size

    What is the basic difference between RSCUSTV6 and SBIW -> General Settings -> Maintain Control Parameters with regard to modifying the data package size?

    Hi,
    Just see the help on
    Maintain Control Parameters for Data Transfer:
    1. Source System
    Enter the logical system of your source client and assign the control parameters you selected to it.
    You can find further information on the source client in the source system by choosing the path
    Tools -> Administration -> Management -> Client Maintenance.
    2. Maximum Size of the Data Package
    When you transfer data into BW, the individual data records are sent in packages of variable size. You can use these parameters to control how large a typical data packet like this is.
    If no entry was maintained then the data is transferred with a default setting of 10,000 kBytes per data packet. The memory requirement not only depends on the settings of the data package, but also on the size of the transfer structure and the memory requirement of the relevant extractor.
    3. Maximum Number of Rows in a Data Package
    With large data packages, the memory requirement mainly depends on the number of data records that are transferred with this package. Using this parameter you control the maximum number of data records that the data package should contain.
    By default a maximum of 100,000 records are transferred per data package.
    The maximum main memory requirement per data package is approximately 2 x 'Max. Rows' x 1,000 bytes; with the default of 100,000 rows that is roughly 200 MB (see the small calculation sketch at the end of this reply).
    4. Frequency
    The specified frequency determines after how many data IDocs an Info IDoc is sent, i.e. how many data IDocs one Info IDoc describes.
    Frequency 1 is set by default. This means that an Info IDoc follows every data IDoc. In general, you should select a frequency between 5 and 10, but no higher than 20.
    The bigger the data IDoc packets, the lower the frequency setting should be. In this way, when you upload, you can obtain information on the respective data load in relatively short intervals.
    With the help of every Info IDoc, you can check the BW monitor to see if there are any errors in the loading process. If there are none, then the traffic light in the monitor will be green. The Info IDocs contain information such as whether the respective data IDocs were uploaded correctly.
    5. Maximum number of parallel processes for the data transfer
    An entry in this field is only relevant from release 3.1I onwards.
    Enter a number larger than 0. The maximum number of parallel processes is set by default at 2. The ideal parameter selection depends on the configuration of the application server, which you use for transferring data.
    6. Background job target system
    Enter the name of the application server on which the extraction job is to be processed.
    To determine the name of the application server, choose
    Tools -> Administration -> Monitor -> System monitoring -> Server. The name of the application server is displayed in the column Computer.
    7. Maximum Number of Data Packages in a Delta Request
    With this parameter, you can restrict the number of data packages in a delta request or in the repetition of a delta request.
    Only use this restriction when you expect delta requests with a very high data volume, so that, despite sufficiently large data package sizes, more than 1000 data packages can result in a request.
    With an initial value or when the value is 0, there is no restriction. Only a value larger than 0 leads to a restriction in the number of data packages. For reasons of consistency, this number is not generally exactly adhered to. The actual restriction can, depending on how much the data is compressed in the qRFC queue, deviate from the given limit by up to 100.
    RSA6:
    Used to change the data packet size.
    Thanks
    Reddy
    Edited by: Surendra Reddy on Mar 12, 2010 6:27 AM
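    To make the sizing rules above concrete, here is a small back-of-the-envelope sketch in Java (purely illustrative, not part of any SAP program). It assumes only the rules of thumb quoted above: roughly MAXSIZE kB of data per package, capped at the maximum number of rows, and about 2 x rows x 1,000 bytes of main memory per package. The variable names and the 400-byte transfer structure width are assumptions made for the example.

        // Rough estimate of BW data package sizing, based on the rules of thumb above.
        public class PackageSizeEstimate {

            public static void main(String[] args) {
                int maxSizeKb = 10_000;     // "Maximum size of the data package" (default 10,000 kB)
                int maxLines = 100_000;     // "Maximum number of rows" (default 100,000 records)
                int rowWidthBytes = 400;    // assumed width of one transfer structure record

                // Rows that fit into MAXSIZE kB, capped by the maximum number of rows
                long rowsBySize = (long) maxSizeKb * 1_000 / rowWidthBytes;
                long rowsPerPackage = Math.min(rowsBySize, maxLines);

                // Help text: max. main memory per package ~ 2 x rows x 1,000 bytes
                long memoryBytes = 2L * rowsPerPackage * 1_000;

                System.out.printf("Rows per package : %,d%n", rowsPerPackage);
                System.out.printf("Approx. memory   : %,d MB%n", memoryBytes / 1_000_000);
            }
        }

    With these example values the kB limit, not the row cap, determines the package size (25,000 rows, roughly 50 MB per package).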

  • Forcing customers to pay for a phone and/or data package they don't want or need

    By their practices Verizon is forcing customers to pay for more than what the customer wants or needs, and at the same time giving the customer less of what they want or need.
    I've been a Verizon Wireless customer for years. Every time my 2-year upgrade is due I check what phones are available. It seems that as time goes by there are fewer and fewer basic phones with a qwerty keyboard, music player, good camera, and video. Now, there are only 3 phones to choose from that don't require a data package: none of them have a camera over 1.3 megapixels, none have video, and ALL had HORRIBLE customer reviews. Those reviews tell me I'm not the only disgruntled Verizon customer.
    I've had Verizon phones that I liked in the past; right now I have a Blitz. I settled for that one because it was the best of the few bad choices available at the time. My most recent 2-year contract has been over for 5 months so now I'm on month-to-month with no contract, which is awesome. But my phone, which I like for the most part, doesn't work right. The problem is the earpiece speaker quit working as soon as my contract was up (maybe a software timebomb to force me to get a new phone?) so I took it to a repair shop to get it replaced. They replaced it but that speaker died after 2 months and since the phone has been discontinued (grr!), there is NO chance of getting a new speaker. That wouldn't be such an issue if the Bluetooth function had ever worked, which it never did. I recently broke down and decided I'd settle on something, so I tried the KinTwo (one of my 3 choices) but it sucked immensely so I sent it back.
    So now I'm forced to buy a phone (with Some of the features I need, because it's obviously impossible to get ALL the features I want/need in One phone) that I don't like; AND pay for a $10 -$30 data package which I don't want or need, and can't afford; AND forced to sign up for a new 2 year contract. I'd rather go with Pay As You Go because I barely use any 'minutes' (I do more texting than talking, which is why I need a good qwerty keyboard) but I'd have to pay over $350 for a fairly decent phone.
    HEY VERIZON, How about providing more than 3 choices of phones that don't require a data package? And how about making those phones GOOD QUALITY and not the garbage you're providing for sale now?
    I'd like to stay with Verizon because they DO have the best service area compared to the other big wireless phone companies. I've been with AT&T and the service area is really not as good. I even did comparison research for a college course's final project because I wanted to know which company had the most truthful claims about their coverage area. I live in the Sierra foothills and reception is a big issue with a lot of people here. I've talked with customers of the other companies and they all agree I should stay with Verizon. But if Verizon can't offer any good phones then I might as well switch to another carrier who may have more concern for their customers' needs. Even after giving us few choices of devices, charging high prices, and gouging us at every opportunity, they give us less incentive to stay a loyal customer. At least AT&T has roll-over minutes, Criket has 'no contract' service, and TracPhone has no-contract/pay-as-you-go service with inexpensive phones.
    So please excuse me while I go to the other carriers' websites to see what they're offering...

    JF2mad wrote:
    Marilee
    You are spot on. Verizon wants to squeeze every dollar it can from its customers, and customer service can take a DISTANT back seat to greed. I have a Chocolate Touch which I got a few years ago (before it REQUIRED a data package). I had activated a different phone which I ended up not liking, and when I tried to re-activate my Chocolate I got slammed with the required data crappage.. er, package. I had to fight to get the data BLOCKED and the phone activated again.
    Wifi is all over, but all the phones with Wifi that Verizon sells require data packages too. I REFUSE to give Verizon $30 minimum for data when I have no need for it. If they had ANY CLUE about customer service they would simply allow you to buy whatever phone you wanted from them (or others) and activate them with data blocks. Until they do I guess I will either be stuck with my old phones or go provider shopping. Which I am sure Budone will find great satisfaction in.
    On that topic, I am amazed that Verizon sits back and lets your apathy and disrespect of ANYONE who does not kowtow to the company line go on as they seem to. You are a "gold user" so clearly you are some sort of Verizon demi-god. Gotta say, there is not one word of disrespect in what I wrote. I do not puke on anyone, BUT I do present a different view than you or the OP may be willing to see with your "woe is me, have pity on me, and let me do what I want, if not involve the government to protect myself from myself" attitude.
    If you took time to look at posts of mine, I have had VZW in my crosshairs a few times, but they made it right. It's all in how you present yourself. Personally, I am willing to bet that corporate suck-ups like you are the very reason many people finally DO make the break from Verizon. Your habit of puking all over people with legitimate complaints, and the tolerance of same on this board, lead ME and I am sure others to feel that your dismissive attitude is acceptable to, if not encouraged by, Verizon in general. And yes, I have been edited in the past. But there was NOTHING in my prior post that has anything you state in it. You just don't like what I said. That is not toeing the company line; that is knowing it costs a ton of money to get a new device to market, and data plans help recoup the cost and allow upgrades and LTE to be deployed. You would be upset if you could not use your phone where there is NOT any WiFi.
    Even though I have had issues with Verizon's service and corporate paradigm (especially the company's determination to extort every penny it can from me), almost every service rep I have spoken to has been considerate, friendly and helpful. There have been one or two who, when I broached the possible need to find a new provider, seemed unconcerned, but in general I have been made to feel valued by the people I have spoken with. {please keep your posts courteous}
    As for the demi-god comment, maybe I can get a MOD or ADMIN to change me from Gold to that! 

  • Query Error Information: Result set is too large; data retrieval ......

    Hi Experts,
    I have a problem with my query. When I execute my report and drill down in my navigation panel, instead of a table with values the message "Result set is too large; data retrieval restricted by configuration" appears. I already applied Note 1127156 (Safety belt: Result set is too large): I imported Support Package 13 for SAP NetWeaver 7.0 BI Java (BIIBC13_0.SCA / BIBASES13_0.SCA / BIWEBAPP13_0.SCA) and executed the program SAP_RSADMIN_MAINTAIN (in transaction SE38) with the object and the value the note specifies... but the problem still appears.
    What could I be missing? How can I fix this issue?
    Thank you very much for helping me out (any help would be rewarded).
    David Corté

    You may ask your Basis guy to increase the ESM buffer (rsdb/esm/buffersize_kb). Did you check the system's memory?
    Did you try to check the error dump using ST22 - Runtime error analysis?
    Edited by: ashok saha on Feb 27, 2008 10:27 PM

  • Creation of data packages due to large amount of datasets leads to problems

    Hi Experts,
    We have built our own generic extractor.
    When data packages are created (due to the large number of datasets), different problems occur.
    For example:
    Datasets are doubled and appear twice: once in package one and a second time in package two. Since those datasets are not identical, information is lost while uploading them to an ODS or cube.
    What can I do? SAP will not help because it is a generic DataSource.
    Any suggestion?
    BR,
    Thorsten

    Hi All,
    Thanks a million for your help.
    My conclusions from your answers are the following:
    a) Since the ODS is a standard one, no datasets are deleted within the transformation; they are aggregated.
    b) Uploading a huge amount of datasets is possible in two ways:
       b1) with selection criteria in the InfoPackage and several uploads
       b2) without selection criteria in the InfoPackage and therefore an automatic split of the datasets into data packages
    c) both ways should have the same result within the ODS
    Ok. Thanks for that.
    So far I have only checked the data within the PSA. In the PSA the number of datasets is not equal for variants b1 and b2.
    Guess this is normal technical behaviour of BI.
    I am fine when results in ODS are the same for b1 and b2.
    Have a nice day.
    BR,
    Thorsten

  • WAD : Result set is too large; data retrieval restricted by configuration

    Hi All,
    When we try to execute the web template with fewer restrictions, we get the error below:
    Result set is too large; data retrieval restricted by configuration
    Result set too large (758992 cells); data retrieval restricted by configuration (maximum = 500000 cells)
    But when we increase the number of restrictions it produces output. For example, if we restrict by fiscal period, company code and brand we get output, but if we restrict by fiscal period alone it throws the above error.
    Note: we are on SP18.
    Do we need to change some setting in the configuration? If yes, where do we need to change it, or what else do we need to do to remove this error?
    Regards
    Karthik

    Hi Karthik,
    the standard setting for web templates is to display a maximum of 50,000 cells. The less you restrict your query, the more data will be displayed in the report. If you want to display more than 50,000 cells the template will not be executed correctly.
    In general it is advisable to restrict the query as much as possible. The more data you display, the worse your performance will be. If you have to display more data and you execute the query from Query Designer, or if you use the standard template, you can individually set the maximum amount of cells. This is described over [here|Re: Bex Web 7.0 cells overflow].
    However I do not know if (and how) you can set the maximum amount of cells differently as a default setting for your template. This should be possible somehow I think, if you find a solution for this please let us know.
    Brgds,
    Marcel

  • Result set is too large; data retrieval restricted by configuration

    Hi,
    While executing a query for a given period, the message 'Result set is too large; data retrieval restricted by configuration' is displayed. I searched SDN and referred to the following link:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/d047e1a1-ad5d-2c10-5cb1-f4ff99fc63c4&overridelayout=true
    Steps followed:
    1) Transaction Code SE38
    2) In the program field, entered the report name SAP_RSADMIN_MAINTAIN and Executed.
    3) For OBJECT, entered the following parameters: BICS_DA_RESULT_SET_LIMIT_MAX
    4) For VALUE, entered the value for the size of the result set, and then executed the program:
    After the said steps, the below message is displayed:
    OLD SETTING:
    OBJECT =                                VALUE =
    UPDATE failed because there is no record
    OBJECT = BICS_DA_RESULT_SET_LIMIT_MAX
    A similar message is displayed for object BICS_DA_RESULT_SET_LIMIT_DEF.
    Please let me know how to proceed on this.
    Thanks in advance.

    Thanks for the reply!
    The objects are not available in the RSADMIN table.

  • Result Set Too Large : Data Retrieval restricted by configuration

    Hi Guys,
    I get the above error when running a large dataset with a hierarchy on, but when I run without the hierarchy I am able to show all the data.
    The Basis guys increased the ESM buffer (rsdb/esm/buffersize_kb) but it still causes an issue.
    Anyone any ideas when it comes to reporting large volumes with a hierarchy?
    Much appreciated,
    Scott

    Hi there
    I logged a message on the service marketplace and got this reply from SAP:
    "You might have to increase the value of the parameters BICS_DA_RESULT_SET_LIMIT_DEF and BICS_DA_RESULT_SET_LIMIT_MAX, as it seems that the result set is still too large. Please check your parameters as to how many data cells you should expect and set the parameter accordingly.
    The cells are the number of data points that would be sent from ABAP to Java. The zero suppression or parts of the result suppression are done afterwards. As a consequence of this, the number of displayed data cells might differ from the threshold that is effective. Starting with SPS 14 you get the information about how many data cells were rejected. That gives you better ways to determine the right setting. Currently you need to raise the number to e.g. 2,000,000 to get all data.
    If BICS_DA_RESULT_SET_LIMIT_MAX is set to a lower value than BICS_DA_RESULT_SET_LIMIT_DEF, it will automatically cut the value of BICS_DA_RESULT_SET_LIMIT_DEF down to its own value.
    Please note that although this parameter can be increased via configuration, you should do a proper system sizing according to note 927530 to ensure that the system can handle the number of users and result set sizes you are expecting."
    Our Basis team have subsequently applied these changes, and I will be testing today.
    Thx

  • Requested buffer too large - but data is already in memory

    Hello all,
    I am writing a program that generates sound and then uses the Java Sound API to play it back over the speakers. Until recently, using clips has not led to any problems. On two computers I can play the sound without a hitch. However, on the newest computer (which also has the largest specs and especially more RAM), I am getting an error while trying to play back the sound. The exception that is thrown is:
    javax.sound.sampled.LineUnavailableException: Failed to allocate clip data: Requested buffer too large.
    I find this odd because the buffer already exists in memory: I don't have to read in a .wav file or anything, because I am creating the audio during the course of my program's execution (this is also why I use Clips instead of streaming - the values are saved as doubles during the calculations and then converted into a byte array, which is the buffer used in the clip.open() method call). It has no problems allocating the double array or the byte array, or populating the byte array. The exception is only thrown during the clip.open() call. I also find it strange that it works on two other computers, both of which have less RAM (it runs fine on machines with 512 MB and 2 GB of RAM, both XP 32-bit). The only difference is that the computer with the issue is running Windows 7 (the RTM build), 64-bit with 6 GB of RAM. I am running it through NetBeans 6.7.1 with memory options set to use up to 512 MB - but it's never gone up that far before. And I've checked the size of the buffer on all three computers and it is the same on each.
    Does anyone know what the issue could be or how to resolve it? I am using JDK6 if that matters. Thank you for your time.
    Edited by: Sengin on Sep 18, 2009 9:40 PM

    Thanks for your answer. I'll try that.
    I figured it had something to do with Windows 7 since it technically hasn't been released yet (however, I have the RTM version thanks to a group at my university in cahoots with Microsoft which allows some students to get various Microsoft products for $12).
    Edit: I just changed the Clip to a SourceDataLine (and made the few other necessary changes, like changing the way the DataLine.Info object was created), wrote the whole buffer into it, drained the line and then closed it. It works fine. I'll mark the question as answered, although that may not be the "correct" answer (perhaps it does have something to do with Windows 7 not being completely tested yet). Thanks.
    Edited by: Sengin on Sep 21, 2009 8:44 PM
    Edited by: Sengin on Sep 21, 2009 8:46 PM
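    For reference, a minimal sketch of the SourceDataLine approach described above: instead of opening a Clip on the whole buffer, the already-generated byte array is written to the line, which is then drained and closed. The 44.1 kHz 16-bit mono PCM format and the audioBytes parameter name are assumptions for the example; adjust the AudioFormat to match the data you actually generate.

        import javax.sound.sampled.AudioFormat;
        import javax.sound.sampled.AudioSystem;
        import javax.sound.sampled.LineUnavailableException;
        import javax.sound.sampled.SourceDataLine;

        public class PlayGeneratedAudio {

            /** Plays an in-memory PCM buffer via a SourceDataLine instead of a Clip. */
            public static void play(byte[] audioBytes) throws LineUnavailableException {
                // Assumed format: 44.1 kHz, 16-bit, mono, signed, little-endian PCM
                AudioFormat format = new AudioFormat(44100f, 16, 1, true, false);

                SourceDataLine line = AudioSystem.getSourceDataLine(format);
                line.open(format);
                line.start();

                // Write the whole generated buffer, then wait for playback to finish
                line.write(audioBytes, 0, audioBytes.length);
                line.drain();
                line.close();
            }
        }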

  • Warning:The EXPORT data cluster is too large for the application buffer.

    Hi Friends,
    I am getting the following warning messages whenever I click on Costing in the Accounting tab:
    1. Costing data may not be up to date  Display Help
    2.  No costing variant exists in controlling scenario CPR0001
    3.  An error occurred in Accounting (system ID3DEV310)
    4. The EXPORT data cluster is too large for the application buffer.
    I can create the project automatically from cProjects. Plan costs, budget and actuals maintained in the WBS elements are visible in the cProjects object links.
    Your reply is highly appreciated.
    Regards,
    Aryan

    Hi;
    Please check Note 1166365.
    We are facing the same problem in R/3, but applying it did not fix the issue.
    Best regards.
    Mariano

  • HT4623 I want to purchase an iPad 32 GB in the larger size, but have read many bad reviews saying the Wi-Fi does not stay connected. I don't want to have to buy another data package as I already own an iPhone that I pay data for. What are my concerns?

    I want to purchase an iPad 32 GB in the larger size, but have read many bad reviews saying they have problems staying connected to Wi-Fi. I don't want to have to buy a data package, so I only want the Wi-Fi one. I already pay Apple for my iPhone data and I can't afford more money. Why are there so many bad reviews, and are the newer ones that much better? According to many reviews they are not. Please help.

    I already pay apple for my iphone data
    You pay your carrier for data, not Apple.
    Why are there so many bad reviews
    Where? Regardless, there are always going to be a handful of people having some sort of issue with any given device, but I would not therefore assume that you would be one of them.

  • Conditional format with large data fails and shows the error "Selection is too large" in Excel 2007

    I am facing an issue with a Paste Special operation using conditional formats on large data in Excel 2007.
    I have uploaded a file at below given location. 
    http://sdrv.ms/1fYC9qE
    The file contains two sheets, Sheet "Data" contains the data on which formats are to be applied and sheet "FormatTables" contains the format tables which contains conditional formating.
    There are two table in "FormatTables" sheet. Both have some conditional formats applied on it. 
    Case 1: 
    1. Select the table range of Table1 i.e $A$2:$AV$2
    2. Copy it
    3. Goto Sheet "Data" 
    4. Select data area i.e $A$1:$AV$20664
    5. Perform a paste special operation on full range and select "Formats" option while performing paste special.
    Result:
    It throws error as "Selection is too large"
    Case 2:
    1. Select the table range of Table2 i.e $A$5:$AV$5
    2. Copy it
    3. Goto Sheet "Data" 
    4. Select data area i.e $A$1:$AV$20664
    5. Perform a paste special operation on full range and select "Formats" option while performing paste special.
    Result:
    Formats get applied successfully.
    Both are the same format tables with the same number of columns, applied to the same data range ($A$1:$AV$20664), yet one case works and the other fails.
    The only difference is that Table1 has an appliesTo range ($A$2:$T$2) covering only part of its total table range ($A$2:$AV$2), whereas Table2 has an appliesTo range ($A$5:$AV$5) that is the same as its total table range ($A$5:$AV$5).
    NOTE : This issue is only in Excel 2007

    Excel 2007 does not support taking the formatting from another range when the table to be formatted has more than 16,000 rows. If you want to apply it to more rows than that, you have to insert one more row in your format table so that it has 3 rows,
    like: A1:AV3
    then try to copy that formatting and apply it.
    Solution for Case 1:
    1. Select the table range of Table1 (i.e. up to AV2) and drag it down one row.
    2. Select the table range of Table1 i.e $A$2:$AV$3
    3. Copy it
    4. Goto Sheet "Data" 
    5. Select data area i.e $A$1:$AV$20664
    6. Perform a paste special operation on full range and select "Formats" option while performing paste special
