Size Restriction in Backend and Portal
Hi all,
I have a query that contains more than 500,000 cells. I modified the BICS_DA_RESULT_SET_LIMIT_MAX parameter in the RSADMIN table (setting it to 900,000), but the default on the portal does not change. If I open the report from the PCD, the BEx Web Application Designer always shows a maximum of 500,000 cells, so I always have to change it manually.
I tried to follow the instructions in 'Activate Size Restriction for an Individual Query View', but there is no 'Restriction for Result Set' tab in the Web Application Designer.
I also tried to modify the standard web template, but didn't find any option for it.
Does anyone know whether I have to modify this limit manually every time?
Thanks in advance,
Adrienn
<< Do not post the same question across a number of forums >>
Similar Messages
-
Size restrictions on arrays and vectors
Hi all!
I want to know whether there is any restriction on the maximum size of arrays and vectors.
If I have thousands of records to be displayed in the browser through a JSP, can I initialise the arrays/vectors with that number?
Will it have an adverse effect on performance?
Thanks in anticipation.

Array lists (and vectors) use an array as their backing data store. Since array indices are ints, the maximum number of elements in an array list is Integer.MAX_VALUE (2^31 - 1).
About performance: if you know (even approximately) how many elements you are going to add, you can create the array list with an initial capacity. This improves performance, because otherwise a new, larger backing array has to be allocated and copied whenever an element is added after the current capacity has been reached.
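A minimal sketch of the initial-capacity point above (the class name and element count are invented for illustration):

```java
import java.util.ArrayList;
import java.util.List;

public class CapacityDemo {
    public static void main(String[] args) {
        int expected = 10_000; // rough estimate of how many records will be shown

        // Pre-sizing the list avoids repeated reallocate-and-copy of the
        // backing array while the list grows toward the expected count.
        List<Integer> records = new ArrayList<>(expected);
        for (int i = 0; i < expected; i++) {
            records.add(i);
        }
        System.out.println(records.size()); // 10000
    }
}
```

The same list without the capacity argument still works; it just pays for several intermediate array copies as it grows.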
Kind regards,
Levi -
JMS Adapter Maximum Message Size Restriction and Impact
Hi ,
We have a business requirement where the maximum message size that needs
to be sent or received is around 25 to 30MB using JMS XI channels.
But, "Note 856346 - J2EE JMS Adapter: Frequently Asked Questions (FAQ)"
specifies as below,
2.12)What are the message size restrictions in 640/700 release and also
710 releases?
Answer: The maximum message size for 640/700 releases is "10MB"
Please let us know,
1) If you see any issues with JMS XI Adapter handling 30MB messages in
Production systems.
2) If 30MB is huge, what is the maximum permissible message size
greater than 10MB as break point.
regards,
Amit

1) If you see any issues with the JMS XI Adapter handling 30MB messages in Production systems.
If the message size is beyond 10MB, the message won't even reach SXMB_MONI; there may not even be an entry in CC monitoring. You mentioned Production systems, but did you check in Dev? The behavior won't change from Dev to Prod.
2) If 30MB is huge, what is the maximum permissible message size greater than 10MB as a break point.
10MB itself is the limit; anything above that won't be processed.
Regards,
Abhishek. -
I recently downloaded a series of instructional videos. They are in MP4 format and rather large (~500MB on average). When I try to open them with QuickTime (the default) they will not play. Are there file size restrictions? I have gone through all of the most recent software updates. Any suggestions would be great. Thanks.
Try VLC Media Player. It has a reputation for playing just about anything you throw at it.
-
Output file size restrictions - HD Video for YouTube
For over a year now YouTube has removed file size restrictions for individuals who request it. The preset output in Premiere Elements 11 for "Online - HD Video for YouTube 1920X1080" will not allow a video longer than 15 minutes. I need to remove this restriction and be able to produce files to upload to YouTube without file size restrictions. How can I do this?
TheBanjoNut
You cannot. And, Premiere Elements 12 has the same YouTube limitations as 11.
You need to export your Timeline to a file saved to the computer hard drive, and then upload that file to YouTube at the YouTube web site.
At YouTube, you would need to explore the opportunities that it offers in the way of "extended times". It may require special accounts.
Without further details of your Premiere Elements project preset and the properties of your source media, I would generalize and suggest that you look at the following choices for creating this file for upload to YouTube at the YouTube web site...
Publish+Share
Computer
AVCHD
with Presets =
MP4 - H.264 1920 x 1080p30
or
YouTube Widescreen HD (changing the settings under the Advanced Button/Video Tab of the preset)
Please review. If you have any further questions on this, please do not hesitate to ask.
Thanks.
ATR -
Which MSS for erp2005 and Portal 7
This isn't as silly a question as it seems - honest
Our system is ERP 2005 and Portal 7.
According to the Portal Content Portfolio there is a Business Package for Manager Self-Service (mySAP ERP), version 1.0, which has the following in its documentation:
Primary Backend System: SAP ERP 2005
Portal Release: NetWeaver 2004s (SP Stack 05 and higher)
Which fits us nicely. The 4 .sca files are roughly 1 MB in size.
I loaded them but they didn't work.
I also loaded the PCUI and Common Parts business packages.
Then upon investigation on service.sap.com/patches I find following the path
SAP Application Components -> SAP ERP -> SAP ERP 2005 -
> Entry by Component -> SAP XSS (Self Services)-> MSS600
Notice the different version number; the 4 .sca files here are roughly 57 MB in size.
Can anyone explain what the difference is between these two versions of MSS? Both are for ERP 2005?
When would you use one and not the other?
By default I usually get the ones from the Portal Content Portfolio (as I did on this occasion).
The same scenario also applies to ESS.
Can anyone explain this?
Thanks

Please note this answer from SAP:
Business Package for Common Parts (mySAP ERP) 1.0
Business Package MSS (mySAP ERP) 1.0
You may download them from the Service Marketplace or from the SDN.
Please find the version for the webdynpro applications below:
SAP MSS 600
SAP PCUI_GP 600 (People Centric User Interface Generic Parts)
You can download this in the Service Marketplace only and not in the
SDN. -
I was wondering what size restrictions there are on XML schemas. I'm developing a schema that has just raised the following error on registration.
ERROR at line 1:
ORA-31084: error while creating table "CAS"."swift564357_TAB" for element "swift564"
ORA-01792: maximum number of columns in a table or view is 1000
ORA-02310: exceeded maximum number of allowable columns in table
ORA-06512: at "XDB.DBMS_XMLSCHEMA_INT", line 0
ORA-06512: at "XDB.DBMS_XMLSCHEMA", line 151
ORA-06512: at line 828
On removing a few elements from the schema it registers fine, but querying the generated table swift564xxx_TAB there is only ever one column, typed with an ADT that itself has only 5 elements. In fact there doesn't seem to be, on the face of it, any type with more than 20-30 elements. Where does this error come from, then?
Unfortunately the schema exceeds the 20k limit on postings. I can split it up and post it in two parts if this would help.
Thanks
Marc

Each attribute in the ADT, and each attribute of any attribute that is itself an ADT, counts as one column.
Here's a snippet from the next version of the doc that may help...
3-20 Oracle XML DB Developer’s Guide, Rel. 1(10.1) Beta 2 Draft
A number of issues can arise when working with large, complex XML Schemas.
Sometimes the error "ORA-01792: maximum number of columns in a table or view
is 1000" will be encountered when registering an XML Schema or creating a table
based on a global element defined by an XML Schema. This error occurs when an
attempt is made to create an XMLType table or column based on a global element
and the global element is defined as a complexType that contains a very large
number of element and attribute definitions.
The error only occurs when creating an XMLType table or column that uses object
relational storage. When object relational storage is selected the XMLType is
persisted as a SQL Type. When a table or column is based on a SQL Type, each
attribute defined by the Type counts as a column in the underlying table. If the SQL
Type contains attributes that are based on other SQL Types, the attributes defined
by those Types also count as columns in the underlying table. If the total number of
attributes in all the SQL types exceeds the Oracle limits of 1000 columns in a table
the storage table cannot be created.
This means that as the total number of elements and attributes defined by a
complexType approaches 1000, it is no longer possible to create a single Table that
can manage the SQL Objects generated when an instance of the Type is stored in the
database.
In order to resolve this problem it is necessary to reduce the total number of
attributes in the SQL Types that are used to create the storage tables. Looking at the
schema, there are two approaches that can be used to achieve this:
The first approach uses a 'top-down' technique that uses multiple XMLType
tables to manage the XML documents. This technique reduces the number of
SQL attributes in the SQL Type hierarchy for a given storage table. As long as
none of the tables needs to manage more than 1000 attributes, the problem is
resolved.
The second approach uses a 'bottom-up' technique that reduces the number of
SQL attributes in the SQL Type hierarchy by collapsing some of the elements and
attributes defined by the XML Schema so that they are stored as a single CLOB.
Both techniques rely on annotating the XML Schema to define how a particular
complexType will stored in the database.
In the case of the top-down technique, the annotations SQLInline="false" and
defaultTable are used to force some sub-elements within the XML document to
be stored as rows in a separate XMLType table. Oracle XML DB maintains the
relationship between the two tables using a REF of XMLType. Good candidates
for this approach are XML Schemas that define a choice where each element
within the choice is defined as a complexType, or where the XML Schema
defines an element based on a complexType that contains a very large number
of element and attribute definitions.
The bottom-up technique involves reducing the total number of attributes in the
SQL object types by choosing to store some of the lower-level complexTypes as
CLOBs, rather than objects. This is achieved by annotating the complexType or
the usage of the complexType with SQLType="CLOB".
Which technique is best depends on the application, and the kind of queries and
updates that need to be performed against the data. -
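For orientation, the two annotation styles described above look roughly like this in an annotated schema. The element, type, and table names are invented for illustration; only the xdb:SQLInline, xdb:defaultTable, and SQLType annotations come from the text above:

```xml
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           xmlns:xdb="http://xmlns.oracle.com/xdb">

  <!-- Top-down: store these sub-elements as rows in their own XMLType
       table, linked back to the parent via a REF of XMLType -->
  <xs:element name="Detail" type="DetailType"
              xdb:SQLInline="false"
              xdb:defaultTable="DETAIL_TAB"/>

  <!-- Bottom-up: collapse this complexType into a single CLOB instead
       of expanding each child element into a SQL attribute -->
  <xs:complexType name="NarrativeType" xdb:SQLType="CLOB">
    <xs:sequence>
      <xs:element name="Line" type="xs:string" maxOccurs="unbounded"/>
    </xs:sequence>
  </xs:complexType>

</xs:schema>
```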
Quicktime 7 file size restriction?
I have some large files shot on HDV and imported into Final Cut Pro X. I want to use QuickTime 7 Pro to split them and export in a different format. The larger files (8-10GB) appear in QT7 with sound only and a black screen. Smaller files (3-5GB) play as expected. My computer is a MacPro 2x2.26 with 20GB RAM. These files WILL play in QTX, but there I don't have the export options of QT Pro.
This leads me to think that there must be a file size restriction for QT Pro 7. Does anyone know what it is?
FYI, the details of the files I'm trying to play: HDV 1080i50 1440x1080 (1880x1062) Millions. I have QuickTime Pro 7.6.6. My work-around has been to split and re-compress in Compressor, but QT7 has always been the quick and dirty solution.

Hi,
There are no restrictions on the content server side, but it will depend on your organization's network latency, speed, etc.
FYI: We have checked in content of around 2GB so far.
Thanks,
Ravinder -
Picture resolution or message size restriction
I have an iPhone 3G and am trying to receive picture messages via MMS. I keep getting a message that says, "The media content was not included due to a picture resolution or message size restriction." Is this happening because of my iPhone or because of my cellular carrier? Is there a way to fix this? Thanks!
Alec S. wrote:
"The media content was not included due to a picture resolution or message size restriction." Is this happening because of my iPhone or because of my cellular carrier?
Your carrier.
Is there a way to fix this?
Sorry, no. All carriers/ISPs have size restrictions.
Remove Message Size Restriction
Hello,
Please suggest a PowerShell command to achieve the two tasks below for bulk mailboxes (listed in a .csv), NOT for an individual mailbox:
1) To apply a Message Size Restriction
2) To remove a Message Size Restriction, so it will use the default
Thanks

Hi,
I edited Johnpaul's command to set message size limits, not quotas.
To apply a Message Size Restriction:
Import-Csv "C:\temp\users.csv" | Foreach-Object {Set-Mailbox -Identity $_.name -MaxReceiveSize 1GB -MaxSendSize 2GB}
To remove a Message Size Restriction, we just set the value to unlimited:
Import-Csv "C:\temp\users.csv" | Foreach-Object {Set-Mailbox -Identity $_.name -MaxReceiveSize Unlimited -MaxSendSize Unlimited}
By the way, message size limit can be set on the following levels:
Organizational Level
Send Connector
Receive Connector
AD Site Links
Routing Group Connectors
Individual
The path evaluated is as follows: User Send Limit > Receive Connector > Organization Checks > Send Connector > User Receive Limit
For more information, please refer to this document:
https://technet.microsoft.com/en-us/library/bb124345(v=exchg.141).aspx
Best Regards.
Lynn-Li
TechNet Community Support -
FIM Service and Portal Installation Ends Prematurely
Hello All,
I'm in the process of setting up a new production FIM 2010 R2 server. I have already installed the FIM Synchronization Service successfully.
I have also installed SharePoint services (WSS 3.0) and configured it for FIM. But when I try to install the FIM Service and Portal,
I keep getting an error that says "FIM Service and Portal Installation Ends Prematurely", with no other details. If anybody has any advice, please let me know.
I have already installed everything on a stand-alone box in a dev environment and it all works correctly; however, I am unable to install in the production environment.

Cameron is right: you should treat the FIMService account almost like a normal AD user account.*
The installer account should be an admin on the box where you are installing the FIMService, and should be sysadmin on SQL during installation of the FIMService. The installer account should be different from the FIMService account itself.
*FIMService account:
Lock down the Service Account
The service account should not be used by any other services or users. The account must not be used to logon interactively and requires no access to any additional resources beyond those granted during setup. The service account is used to provide the security
context for the MIIS service as it accesses resources on the MIIS server and the associated database. It also provides the security context for the execution of any rules extensions.
Lock down the service account to ensure no malicious user is able to sign in using its credentials and gain access to MIIS data. Configure Group Policies to lock down the account and restrict access to this account. Since the MIIS service only needs the account
to run as a service, restrict the account as follows:
Deny logon as a batch job.
Deny logon locally.
Deny logon through Terminal Services.
Deny access to this computer from the network.
-
FileInfo.CopyTo Size Restrictions
Are there file size restrictions in using the FileInfo.CopyTo function?
I have a Service which sorts files around on a machine, and I tend to see failures with very large file sizes. The routine moves files from one disk array to another.
If I open File Explorer and copy the file manually I do not get an error.
The Event log under System will show a message:
{Delayed Write Failed} Windows was unable to save all the data for the file. The data has been lost. This error may be caused by a failure of your computer
hardware or network connection. Source Ntfs. Category None
Followed by:
Application popup
Windows - Delayed Write Failed. Windows was unable to save all data for the file xxxxxx. The data has been lost. Source Application popup. Category None.
The file today is about 50GB in size and is moving from one SAS array to another. I only get these failures with very large files (larger than 5GB), and of course
it is not consistent. The files always work fine in File Explorer, which leads me to believe that it is not a hardware issue.
So, to make the question simple: is there a time-out period in the CopyTo routine that would cause it to decide that the copy has failed prematurely? Is there a better
routine I should be using to move the files around on the system?
I'm not seeing an exception being logged by the Service under Application during this time.
I appreciate any information on this topic.
Thanks,
Mike

The error is coming from the file system driver; this isn't a .NET issue. The failure occurs when the driver is trying to write to the drive (it uses delayed writing to optimize writes) and that write is failing. FileInfo.CopyTo does nothing but verify
the path and then call Win32's CopyFile API. Note that Windows Explorer more likely uses CopyFileEx instead. They are similar, but CopyFileEx provides a few additional options useful in UIs. There is no .NET wrapper around that call.
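As an aside, when a single monolithic copy of a very large file fails, copying in fixed-size chunks and forcing each chunk to disk can keep the OS write-back cache from accumulating one huge delayed write. A rough sketch of that idea in Java (the chunk size is arbitrary, and this illustrates the general technique rather than a fix for the driver-level error above):

```java
import java.io.IOException;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class ChunkedCopy {
    // Copy src to dst in fixed-size chunks, forcing each chunk to disk
    // before continuing so the OS never buffers a huge delayed write.
    public static void copy(Path src, Path dst, long chunkSize) throws IOException {
        try (FileChannel in = FileChannel.open(src, StandardOpenOption.READ);
             FileChannel out = FileChannel.open(dst,
                     StandardOpenOption.CREATE,
                     StandardOpenOption.WRITE,
                     StandardOpenOption.TRUNCATE_EXISTING)) {
            long pos = 0;
            long size = in.size();
            while (pos < size) {
                // transferFrom writes into dst at position pos, reading
                // from src's current position (which advances as we go)
                pos += out.transferFrom(in, pos, Math.min(chunkSize, size - pos));
                out.force(false); // flush this chunk's data to the device
            }
        }
    }
}
```

Whether the periodic force() call helps in a given failure scenario depends on the storage stack; it trades throughput for smaller in-flight write bursts.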
CRM Application Server Java and Portal components on same Java stack
Hi Team,
We are currently performing solution design and want some information on CRM install.
We want to know if there is any restriction on installing the CRM Application Server Java and Portal components on the same Java stack (with the same SID). We would also like to know whether SAP WEB UIF 7.0 can work similarly to the Portal for CRM applications.
Any information around this would be appreciated.
Thanks

Do you have any servlets or filters defined using annotations? Maybe one of these servlets/filters catches the '/login' request first. I have a very similar solution and it works fine. Also, how do you check that the servlet under the '/login' URL pattern isn't invoked?
-
Size restrictions for USB memory sticks?
Hi out there,
are there any known size restrictions for the use of USB memory sticks with Mac OS X?
I bought two 32 GByte (no-name) USB memory sticks to "mobilize" my iTunes library. I formatted them without errors (believing what the Disk Utility application told me) and copied my files to one stick. But only about 4 GByte of data could be opened after copying. The other files were shown in the directories with their correct sizes (I compared them to the originals on my hard disk) but couldn't be opened.
I tried copying them a second time, using the second 32 GByte stick, but saw the same behaviour as on stick one.
Next I tried to format both sticks with Disk Utility using the option "writing zeros in seven runs" and then tested data integrity with the TechTool Pro application (surface test, random read and write). Both showed no errors on the sticks. But copying my library again to the sticks showed no different behaviour than before. The first few copied GByte (I think about 4 GByte) could be opened by iTunes, but the rest "is only there".
Any idea what's going wrong?
Greeting from Germany
Ansgar

Hello Ansgar,
that sounds like one of those fake 32GB sticks.
http://www.digitv-forum.co.uk/viewtopic.php?t=3680 -
SSO between Portal Application and Portal Admin Tool
Hi All,
We have a requirement for implementing SSO between a Portal application and
Portal admin tool.
We are using WL Portal 8.1 SP4.
Here is the reason for this requirement -
A user logged-into Portal Application needs to login to Portal Admin tool to
do some admin activity. We want to provide a link in the portal application
using which the user can directly login to the Portal Admin tool without
having to enter the credentials again.
If someone has any info on how to implement this, can you please point me in
the right direction.
Thanks,
~Deepak

Hi,
When creating a PP (Public Part) you have two options:
a PP used for compiling, and a PP used for building.
You create one PP with all the libraries for developing/compiling other DCs,
and another PP with all the libraries that can be packaged into other build results (SDAs).
Once you have these two PPs in place, you add the DC as a used DC.
And this should resolve the issue.
Hope this helps.
Cheers,
Pramod