Meaning of the E-table when dealing with compression

Hi all,
I know that during InfoCube compression the number of records in the F-table decreases, because those records are moved to the E-table.
What is the meaning of the E-table? What is the difference between the E-table and the F-table?
Thanks a lot for the help
Best regards

If in your F-table you have:
REQUEST_ID | DOCUMENTNUMBER | ...
001        | 100            | ...
001        | 100            | ...
001        | 100            | ...
then in the E-table (by suppressing the request ID dimension) you will have only one record:
DOCUMENTNUMBER | ...
100            | ...
This is due to the elimination of a dimension!
Hope it is clearer now...
Bye,
Roberto
(Take a look at these tables directly in SE16/SE11: you will surely understand the whole concept better!)
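
Roberto's example can be mimicked outside of BW with a small conceptual sketch in Java (the record type, field names and amounts below are invented purely for illustration; this is not how BW physically stores a cube): once the request ID is dropped from the key, rows that agree on all remaining characteristics are summed into a single row, which is exactly what compression does when it moves data from the F-table into the E-table.

import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class CompressSketch {
    // A fact row as it sits in the F-table: request ID + characteristic + key figure.
    record FactRow(String requestId, String documentNumber, double amount) { }

    public static void main(String[] args) {
        List<FactRow> fTable = List.of(
                new FactRow("001", "100", 10.0),
                new FactRow("002", "100", 5.0),
                new FactRow("003", "100", 7.5));

        // "Compression": drop the request ID from the key and sum the key figures.
        Map<String, Double> eTable = new LinkedHashMap<>();
        for (FactRow row : fTable) {
            eTable.merge(row.documentNumber(), row.amount(), Double::sum);
        }

        // Three F-table rows collapse into a single E-table row: {100=22.5}
        System.out.println(eTable);
    }
}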

Similar Messages

  • Can ui:table deal with large table?

    I have found that h:dataTable can do pagination because its data source is just a DataModel. But ui:table's data source is a data provider, which looks somewhat complex and confusing.
    I have a large table and I want to load the data on demand, so I tried to implement a provider. But I soon found that ui:table seems to always load all data from the provider.
    In TableRowGroup.java there is a lot of code such as:
    provider.getRowKeys(rowCount, null);
    Passing null makes the provider load all data.
    So ui:table can NOT deal with a large table!?
    thx.
    fan

    But ui:table just uses the TableDataProvider interface. The TableDataProvider is a wrapper for the CachedRowSet.
    There are two layers between the ui:table component and the database table: the RowSet layer and the Data Provider layer. The RowSet layer makes the connection to the database, executes the queries, and manages the result set. The Data Provider layer provides a common interface for accessing many types of data, from rowsets, to Array objects, to Enterprise JavaBeans objects.
    Typically, the only time that you work with the RowSet object is when you need to set query parameters. In most other cases, you should use the Data Provider to access and manipulate the data.
    What can a CachedRowSet (or CachedRowSetDataProvider?) do? Check out the API that I pointed you to, to see what you can do with a CachedRowSet.
    Does the Table cache the data itself? Maybe this way is convenient for filtering and ordering? Thx.
    I do not know the answers to these questions.
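
    If the goal is to load data on demand rather than all at once, one lower-level approach (a minimal sketch with placeholder connection details, not using the Visual Web data provider classes at all) is to page through a CachedRowSet with setPageSize()/nextPage() and hand only the current page to the table:

    import java.sql.SQLException;
    import javax.sql.rowset.CachedRowSet;
    import javax.sql.rowset.RowSetProvider;

    public class PagedRowSetDemo {
        public static void main(String[] args) throws SQLException {
            CachedRowSet crs = RowSetProvider.newFactory().createCachedRowSet();
            // Placeholder connection details -- adjust for your environment.
            crs.setUrl("jdbc:oracle:thin:@//dbhost:1521/ORCL");
            crs.setUsername("user");
            crs.setPassword("pass");
            crs.setCommand("SELECT id, name FROM big_table");
            crs.setPageSize(500);              // fetch only 500 rows at a time
            crs.execute();

            do {
                while (crs.next()) {
                    // Feed just this page to the table / data provider.
                    System.out.println(crs.getLong("id") + " " + crs.getString("name"));
                }
                // Note: whether execute() already populates the first page can vary
                // between rowset implementations; check your implementation's javadoc.
            } while (crs.nextPage());          // load the next 500 rows on demand

            crs.close();
        }
    }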

  • TS1814 I tried updating my iPhone with iTunes and it told me: You cannot download because the Apple Mobile Device service is not started. What does that mean and how do I deal with this??

    I tried updating my iPhone with iTunes and it stated: You cannot complete your download because the Apple Mobile Device service is not started... What does that mean and how do I fix it or get around it??
    Rosie

    Type "Apple Mobile Device service" into the search bar at the top of this screen by Support and read the resulting help article

  • XSU: Dealing with large tables / large XML files

    Hi,
    I'm trying to generate an XML file from a "large" table (about 7 million rows, 512 MB of storage) by means of XSU. I run into "java.lang.OutOfMemoryError" even after raising the heap size up to 1 GB (option -Xmx1024m on the java command line).
    For the moment, I'm involved in an evaluation process. But in the near future, our applications are likely to deal with large amounts of XML data (typically hundreds of MB of storage, which means possibly GB of XML code), both in updating/inserting data and in producing XML streams from existing data in a relational DB.
    Any ideas about memory issues regarding XSU? Should we consider using XMLType instead of "classical" relational tables loaded/unloaded by means of XSU?
    Any hint appreciated.
    Regards,
    /Hervi QUENIVET
    P.S. Our environment is Red Hat Linux 7.3 and an Oracle 9.2.0.1 server.

    Try to split the XML before you process it. You can take a look at the XMLDocumentSplitter explained in the book Building Oracle XML Applications by Steve Muench.
    The other alternative is to write your own SAX handler and send the chunks of XML for insert.
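
    If XSU keeps materializing the whole document in memory, one workaround (a plain-JDBC sketch with made-up table and column names, bypassing XSU entirely) is to stream the rows to the output file yourself, so the heap only ever holds one fetch window of rows:

    import java.io.BufferedWriter;
    import java.io.FileWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class StreamingXmlExport {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details -- adjust for your environment.
            try (Connection conn = DriverManager.getConnection(
                         "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "pass");
                 Statement stmt = conn.createStatement();
                 BufferedWriter out = new BufferedWriter(new FileWriter("big_table.xml"))) {
                stmt.setFetchSize(1000);   // keep only a small window of rows in memory
                ResultSet rs = stmt.executeQuery("SELECT id, name FROM big_table");
                out.write("<ROWSET>\n");
                while (rs.next()) {
                    out.write("  <ROW><ID>" + rs.getLong("id") + "</ID><NAME>"
                            + escape(rs.getString("name")) + "</NAME></ROW>\n");
                }
                out.write("</ROWSET>\n");
            }
        }

        // Minimal XML escaping for text content.
        private static String escape(String s) {
            return s == null ? "" : s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;");
        }
    }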

  • How to deal with deadlock on wwv_flow_data table when http server times out

    There are some threads about a deadlock on the wwv_flow_data table. None of them contain a real explanation for this behaviour. In my case I will try to explain what I think is happening. Maybe it helps somebody who is hitting the same matter.
    In my case, with APEX 3.2.1, I am navigating from one page to another. While doing this, APEX will lock the table wwv_flow_data. As soon as the other page is shown, the lock is released. But this other page contains a badly performing query (standard report region). After 5 minutes the HTTP server (mod_plsql) times out and presents the message "No response from the application server" on the screen. In the meanwhile the query is still running on the database server and the lock stays on the wwv_flow_data table.
    Normal user behaviour will be that the user uses the back button to return to the previous page and tries again to navigate to the other page, or
    the user tries to refresh the page with the badly performing query.
    And voilà, now you have a deadlock on the wwv_flow_data table, since a second session is trying to do the same thing while the first hasn't finished yet.
    How to deal with it?
    First of all, have a good look at the badly performing query. Maybe you can improve it so that it finishes before the HTTP server times out.
    In my case the 11gR1 optimizer couldn't handle a subquery factoring clause in the best way. After changing it back to a classical inline query, the problem was solved.
    Secondly, you could increase the timeout parameter of the HTTP server, although this is not the best way.
    Maybe it would be better if APEX, in a next version, released the lock on the wwv_flow_data table earlier, or did a rollback just before the moment the HTTP server times out.
    regards,
    Mathieu Meeuwissen

    Hello Shmoove,
    I saw your reply here and you probably understand the problems the HTTP 100 response may cause.
    I am trying to send an image that was taken by getSnapshot. The problem is that the server responds with this HTTP 100 message.
    I suspect that the reason my server doesn't recognize the file that I'm sending from J2ME is that the "server to client" response to the 100 message only comes after the second client message (see what the TCP/IP viewer shows below):
    POST /up01/up02.aspx HTTP/1.1
    Content-Type: multipart/form-data; boundary=xxxxyyyyzzz
    Connection: Keep-Alive
    Content-length: 6294
    User-Agent: UNTRUSTED/1.0
    Host: szekely.dnsalias.com:80
    Transfer-Encoding: chunked
    400: Client to Server (126 bytes)
    78
    --xxxxyyyyzzz
    Content-Disposition: form-data; name="pic"; filename="david.jpg"
    Content-Type: application/octet-stream
    400: Connected to Server
    400: Server to Client (112 bytes)
    HTTP/1.1 100 Continue
    Server: Microsoft-IIS/5.1
    Date: Wed, 23 Mar 2005 00:47:02 GMT
    X-Powered-By: ASP.NET
    Any help will be appreciated,
    David
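
    For reference, the client side of such an upload in MIDP might look roughly like the sketch below (the URL, boundary, field name and file name are taken from the trace above; everything else is an assumption). Building the whole multipart body into a byte array first lets the implementation send a Content-Length header, which on many devices avoids the chunked Transfer-Encoding visible in the trace:

    import java.io.ByteArrayOutputStream;
    import java.io.OutputStream;
    import javax.microedition.io.Connector;
    import javax.microedition.io.HttpConnection;

    public class ImageUploader {
        public void upload(byte[] jpeg) throws Exception {
            String boundary = "xxxxyyyyzzz";

            // Build the complete multipart body up front.
            ByteArrayOutputStream body = new ByteArrayOutputStream();
            body.write(("--" + boundary + "\r\n"
                    + "Content-Disposition: form-data; name=\"pic\"; filename=\"david.jpg\"\r\n"
                    + "Content-Type: application/octet-stream\r\n\r\n").getBytes());
            body.write(jpeg);
            body.write(("\r\n--" + boundary + "--\r\n").getBytes());
            byte[] bytes = body.toByteArray();

            HttpConnection hc = (HttpConnection) Connector.open(
                    "http://szekely.dnsalias.com:80/up01/up02.aspx");
            try {
                hc.setRequestMethod(HttpConnection.POST);
                hc.setRequestProperty("Content-Type",
                        "multipart/form-data; boundary=" + boundary);
                OutputStream os = hc.openOutputStream();
                os.write(bytes);                 // one write of the full body
                os.close();
                int rc = hc.getResponseCode();   // read the final response, not the 100
                System.out.println("HTTP response: " + rc);
            } finally {
                hc.close();
            }
        }
    }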

  • I opened a file on my desktop that I don't remember putting there.  It turned out to be a keychain certificate from a client of ours.  Does this mean that they were spying on me?  What is the deal with that?  Any ideas?

    I opened a file on my desktop that I don't remember putting there. We use many photos and I thought it was a photo file I was looking for. It turned out to be a keychain certificate from a client of ours.  Does this mean that they were spying on me?  What is the deal with that?  Any ideas?

    Interesting tidbit. I created an AAC of the original file, deleted the original MP3 from my library, and also deleted the Clean matched track from iCloud.
    The result is that it matched with the explicit version of Mrs. Officer this time.
    What I am curious about is which songs this is happening for. I've gone through a few batches of about 500 songs at a time and re-downloaded many tracks in 256k. Sadly we don't have people to bring this to our attention, and I have so much music that it's impossible to go through every song to make sure I am getting the right version.

  • How to deal with the growing table?

    Tables grow in every application, and in some applications tables become larger and larger quickly. How do you deal with this problem?
    I am developing an application system now. Should I add a lot of delete commands in the code for each table?

    junez wrote:
    Tables grow in every application, and in some applications tables become larger and larger quickly. How do you deal with this problem? I am developing an application system now. Should I add a lot of delete commands in the code for each table?
    Uh, well, yes: if you continually add rows to a table, the table will grow ... sooner or later you will want to delete rows that are no longer needed. What did you expect? You have to decide what the business rules are to determine when a row can be deleted, and make sure your design allows for such rows to be identified. This is called ..... analysis and design.
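
    As a concrete illustration of that kind of housekeeping (a plain-JDBC sketch; the table name, column name and the 90-day retention rule are made-up examples of such a business rule), a scheduled purge job could look like this:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class AuditLogPurge {
        public static void main(String[] args) throws Exception {
            int retentionDays = 90;   // business rule: keep 90 days of history
            // Placeholder connection details -- adjust for your environment.
            try (Connection conn = DriverManager.getConnection(
                         "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "pass");
                 PreparedStatement ps = conn.prepareStatement(
                         "DELETE FROM audit_log WHERE created_at < SYSDATE - ?")) {
                ps.setInt(1, retentionDays);
                int deleted = ps.executeUpdate();   // run this from a nightly scheduler job
                System.out.println("Purged " + deleted + " rows older than "
                        + retentionDays + " days");
            }
        }
    }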

  • Dealing with DB table-entries in ABAP OO

    Hi everybody,
    in ABAP reports we normally deal a lot with DB table entries.
    Is there a "state of the art" for doing this with ABAP OO?
    An easy example:
    Assume I want to select table entries from BUT000 into an internal table, and then I want to write the entries.
    What would this look like in ABAP OO?
    Should the internal table be my object?
    Or should every BUT000 table record be my object?
    Regards, Mario
    Message was edited by:
            Mario Müller

    Hello Mario,
    A very good question. This is what is called modeling.
    I shall give you an approach to this. There is no right and wrong way of doing it; probably only a more desirable or better way!
    How you model it depends on what your object is.
    If you have an internal table of sales orders, in the real world the services or methods still act on a single sales order. So the object here is a single sales order.
    => I would model the class to deal with one sales order (this more or less answers your question).
    Just to take this a little further:
    What I would do is have 3 different layers of abstraction:
    a UI class, a business layer class and a DB class.
    The UI class can only talk to the business class, and the business class can talk to the DB class. The DB class is a static class.
    The UI class is only responsible for doing the display job.
    The UI will display multiple sales orders, for example, so you have an internal table of instances of the business layer.
    The business layer itself does validation and processing for each sales order.
    The business layer can also have some static methods (or class methods) to select multiple records from the database. These are static because they are not acting on one sales order but returning multiple. Such a method should simply call a method of the DB layer.
    The DB layer is meant only to read from DB and write to DB.
    Hope this helps. Remember to reward points, if it does.
    For more highlights into this, refer to some material on design approach or design pattern.
    Best Rgds,
    Prashanth.
    SAP.
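
    The layering Prashanth describes is language-neutral; as a rough illustration in Java rather than ABAP OO (all class, method and field names below are invented for the sketch), it could look like this:

    import java.util.List;

    // DB layer: static-only class, the single place that would read/write the database.
    final class SalesOrderDb {
        private SalesOrderDb() { }
        static List<SalesOrderData> selectByCustomer(String customerId) {
            // Real code would SELECT from the sales order table here; hard-coded for the sketch.
            return List.of(new SalesOrderData("4711", customerId, 250.0),
                           new SalesOrderData("4712", customerId, 99.9));
        }
    }

    // Plain data record handed between the layers.
    record SalesOrderData(String orderId, String customerId, double amount) { }

    // Business layer: one instance per sales order; owns validation and processing.
    class SalesOrder {
        private final SalesOrderData data;
        SalesOrder(SalesOrderData data) { this.data = data; }

        boolean isValid() { return data.amount() >= 0; }

        // Class (static) method for multi-record reads: delegates to the DB layer.
        static List<SalesOrder> loadByCustomer(String customerId) {
            return SalesOrderDb.selectByCustomer(customerId).stream().map(SalesOrder::new).toList();
        }

        SalesOrderData data() { return data; }
    }

    // UI layer: talks only to the business layer and only does display work.
    public class SalesOrderListView {
        public static void main(String[] args) {
            for (SalesOrder order : SalesOrder.loadByCustomer("CUST01")) {
                System.out.println(order.data().orderId() + " -> " + order.data().amount());
            }
        }
    }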

  • Best way to deal with Mutating table exception with Row Level Triggers

    Hello,
    It seems that the best way to deal with mutating table exception(s) is to put all the trigger code in a package and use it in conjunction with a statement-level trigger.
    This sounds quite cumbersome to me. I wonder, is there any alternative for dealing with mutating table exceptions?
    With Regards

    AskTom has a good article about this,
    http://asktom.oracle.com/tkyte/Mutate/index.html

  • Dealing with Denormalized Tables - jdev 11.1.2.3 redhat 5.8

    Hello:
    In a legacy Oracle system, we have many tables that are not in 3rd normal form, i.e. they are denormalized.
    Example: A student has many teachers and a teacher has many students... a classic many-to-many. The normal solution would be to create an association table to hold the intersection.
    However, in this legacy system we have a student table with columns teacher1, teacher2, teacher3, teacher4, teacher5, teacher6, teacher7 and no association table.
    Without modifying the table structures, what is the best way to deal with this in an ADF application? My guess would be to create 7 view links and write code to implement
    any logical rules.
    Does anyone have other ways around this problem?
    Thanks for the help.

    Hi,
    how about using a database view
    e.g.
    select student_id, teacher1 from your_table
    UNION
    select student_id, teacher2 from your_table
    UNION
    select student_id, teacher3 from your_table
    Create an entity and view object based on the database view. If you need teachers to be updateable, create stored procedures and override the entity object doDML method for insert/update/delete
    Frank
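
    A rough sketch of the doDML override Frank mentions (based on the ADF EntityImpl API; the PL/SQL package name, procedure names and attribute names are assumptions for illustration):

    import java.sql.CallableStatement;
    import java.sql.SQLException;
    import oracle.jbo.server.EntityImpl;
    import oracle.jbo.server.TransactionEvent;

    // Entity object based on the STUDENT_TEACHER_V database view.
    public class StudentTeacherImpl extends EntityImpl {

        @Override
        protected void doDML(int operation, TransactionEvent e) {
            // Route DML through stored procedures instead of the (non-updatable) view.
            if (operation == DML_INSERT) {
                callProc("student_teacher_pkg.add_teacher");
            } else if (operation == DML_DELETE) {
                callProc("student_teacher_pkg.remove_teacher");
            } else {
                // DML_UPDATE would be routed to an update procedure in the same way.
                super.doDML(operation, e);
            }
        }

        private void callProc(String procName) {
            String stmt = "BEGIN " + procName + "(?, ?); END;";
            CallableStatement cs = getDBTransaction().createCallableStatement(stmt, 0);
            try {
                cs.setString(1, (String) getAttribute("StudentId"));
                cs.setString(2, (String) getAttribute("TeacherId"));
                cs.execute();
            } catch (SQLException ex) {
                throw new oracle.jbo.JboException(ex.getMessage());
            } finally {
                try { cs.close(); } catch (SQLException ignore) { }
            }
        }
    }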

  • Requesting guidance on how best to deal with removal of CreateElementSteps for tables when option to 'script validation for new constraints' is enabled during schema compare

    I have a DeploymentPlanModifer subclass that is responsible for removing certain tables from a deployment plan under specific conditions. It is relatively trivial to find the
    CreateElementSteps I need and subsequently remove them via
    DeploymentPlanModifier.Remove(), but...
    ... if in my comparison I have enabled the 'Script validation for new constraints' option, the deployment plan will contain a
    DeploymentScriptDomStep with a Batch containing AlterTableConstraintModificationStatements for tables with foreign key constraints. My problem starts here - there will be an orphaned
    AlterTableConstraintModificationStatement for each of the tables that I removed. Obviously, execution of the generated script comes to a grinding halt when asking SQL Server to alter a table that is never created.
    I'm able to get around this by digging around in the aforementioned batches and removing the orphaned alter statements, but this seems really hacky, which makes me think I'm missing the proper way of dealing with this.
    So... if anyone is aware of a more correct way of avoiding this problem, I would really appreciate finding out more about it.
    Thanks in advance for any help. :-)

    Hi Greg. Unfortunately there is not an easy solution here. Walking the deployment model, spotting potential issues and excluding them really is what you have as an option here. The alternative is to pre-process the dacpac to have it in the form you want,
    but I'm not sure if this is an option in your case and it also has limitations.
    Regards,
    Kevin

  • How does lightroom 4 deal with lossless compressed raw files

    how does lightroom 4 deal with lossless compressed raw files

    Ok, sorry about the vague question. My concern is centered around a new camera purchase (Nikon D810) which produces huge files. I shoot in RAW and have never used lossless compression, only uncompressed RAW, because of my conservative nature. But now it looks like I will be using lossless compressed to help deal with the file size. So the question was based on fears of data loss. Thanks for the replies; they confirm what I already suspected, so now I can feel confident using lossless compression.
    Thanks

  • Export Issues with Compressed Partition Tables?

    We recently partitioned and compressed some large tables. It appears, though I'm not sure yet, that this is causing the export to run extremely slowly. The database is at 10.2.0.2 and we are using the exp utility, not Data Pump. Does anyone know of any known issues with using exp to export compressed, partitioned tables?

    Can you give more details of the table structure (with dbms_metadata if possible), and how you are taking the export, please?
    Did you try to take a SQL trace of the export process to see what is going on behind the scenes? Here is an introduction, if you need it:
    http://tonguc.wordpress.com/2006/12/30/introduction-to-oracle-trace-utulity-and-understanding-the-fundamental-performance-equation/

  • iMovie is limited to its import formats and does not allow storing video in any raw format, which means all videos will have lossy compression applied to the video, with no way to edit full-format video. Is this statement true?

    iMovie is limited to its import formats and does not allow storing video in any raw format, which means all videos will have lossy compression applied to the video, with no way to edit full-format video. Is this statement true? Does iMovie support HD 1080p @ 60 fps?

    Iggy826 wrote:
    …   Is this statement true?
    yes and no!
    using the intended Import from Camera routines, iM converts automatically into its very own Apple Intermediate Codec. so, the answer is yes.
    but …
    a) Apple claims AIC is a non-lossy intermediate codec. working with ProRes in FCPX taught me that there are degrees of 'non-lossy-ness' among such codecs.
    b) iM offers an Archive feature, which is basically a simple Finder copy operation that 'clones' your SD card's content into some folder on your hard drive; e.g. you can later use these untouched 'raws' in another editor such as FCPX. so, the answer is no.
    c) when you 'override' the import routines by manually re-wrapping mts into a mov container, iM handles the 'native' h264s ... so, the answer is no.
    d) adding any effect, transition etc. reduces any interlaced source to 540p (if you're working with 720p source, the resolution is kept) … so, the answer is yes. or, in the 720p case, no.
    < Johnny Depp's voice >… savvy?
    Does iMovie support HD 1080p @ 60 fps?
    via Import from Camera? No.
    via re-wrapped mts>>mov? I was told yes, some say no.
    one thing to keep in mind when talking about these 'issues':
    iM, this 12€ app, is meant as a CONSUMER toy tool.
    it supports AVCHD version 1 (= no 1080/60p, no >24 Mbps, no 3D).
    fingers crossed, we'll soon see some iM/QT/FCPX update with support for AVCHD v2 ........
    the main consumer devices are within the specs iM supports.
    using professional equipment makes usage of professional software (FCPX, AP) optional.
    sorry for the lengthy answer.

  • Issues dealing with the Qualified tables

    Hi,
    I am facing the below issues while dealing with qualified tables.
    Issue #1: Trying to get the data from the qualified table through the main table. The qualified table is set as a supporting ResultDefinition. I am able to get the values only for the non-qualifier fields, but not the qualifier fields.
    Exception: com.sap.mdm.commands.CommandException: com.sap.mdm.internal.protocol.manual.ServerException: Qualifier values are not part of a qualified lookup record
    Issue #2: I need to define a search on the fields of a qualified table, with the main table in the ResultDefinition. I am able to define the search on qualifier fields but not on the non-qualifier fields.
    Exception: com.sap.mdm.commands.CommandException: com.sap.mdm.internal.protocol.manual.ServerException: Field not found
    I would like to know the standard ways to address these issues.
    Thanks,
    Surendra

    Hello,
    I was wondering if this issue is solved now. I'm still facing the problem when executing the RetrieveLimitedQualifierValuesCommand:
    com.sap.mdm.commands.CommandException: com.sap.mdm.internal.protocol.manual.ServerException: Field not found
    Are there other ways to manipulate qualifier field values? Workarounds... alternatives...
    Is it related to the MDM Java API version?  If yes, is there a fix?
    Thanks a lot for your input!
    Regards,
    Pedro
