Levels and more.

Hello,
I am here today to ask what the ranks/levels are on these support communities. If you could provide me with a link or something, that'd be great. I'm also wondering if there is a way to get a signature here.
Shananigan

Shananigan wrote:
Is there a way to set it as the default?
I don't think so, but perhaps someone will chip in here with a way. I use the Camera icon to insert my signature image.
Shananigan wrote:
How did you make that?
See this post.

Similar Messages

  • Dunning: Old dunning level and dunning level

    Hi experts,
    I have one issue in dunning: the dunning history for a particular customer shows the old dunning level as 3 and the dunning level as 2.
    I have checked the customer master; there is a change in the dunning procedure. But the document and master data show the correct dunning level with the new procedure.
    So I don't understand what the OLD dunning level and the dunning level are for the same customer. The issue is only with one customer.
    I have also checked the old procedure for the same customer in table MHNK; some notices were done there. But how come the current invoice shows the old dunning level for the same invoice?
    Can anyone explain how to proceed?
    Regards
    Ashok
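    For reference, one way to inspect that notice history yourself is a quick look at MHNK in SE16, or a query along these lines (a sketch only; it assumes the standard fields KUNNR for the customer, LAUFD/LAUFI for the run date/ID, and MAHNS for the dunning level):
    SELECT laufd, laufi, mahns
      FROM mhnk
     WHERE kunnr = '<customer>';  -- placeholder: the affected customer number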

    Hi
    Thanks for the reply. So we need to apply this note to make the old dunning level and the dunning level equal after running the program.
    Also, one more doubt: I have observed that the dunning run, after the parameters are entered, looks like this:
    Parameters were maintained
    Dunning selection executed, job deleted
    Dunning notice printed, job deleted
    The above doesn't mean that the old dunning level will be updated, right?
    Regards
    Ashok

  • How to add three fields at Sales order item level and suppress/hide many

    Hi Gurus,
    My client's requirements are:
    1. Three fields to be added at Sales order item level, and they should flow through to billing.
    2. Suppress/hide most of the fields in the Sales order, so that the end user will be happy (how can this be done through SHDO?).
    Please share your experiences and help me.
    BEST REGARDS
    Srikanth

    Hi Subba Rao
    In the VA01 screen, after entering Material/Qty, the client wants to enter three more details, say X/Y/Z,
    and after that he doesn't want to see most of the fields displayed in the VA01 screen.
    I think it makes sense.
    Regards
    Srikanth

  • Difference between Low level and Normal IMAQ ?

    There are two options in the IMAQ palette: a low-level one and a normal one. Both have the same function names and the same icons, the only difference being the black-and-white icons of the low-level functions and the colored icons of the normal functions. What is the difference between these two, and what is the need for a separate low-level palette of functions?
    Regards
    Asad Tirmizi
    Design Engineer
    Institute of Avionics and Aeronautics
    " Its never too late to be, what u want to be"
    Using LabVIEW 8.2

    Low-level IMAQ functions give you relatively more control over the hardware. The buffer functions are one example of something that is not on the high-level palette.

  • DB Copy (MSSQL) between two SAP systems with different levels and components

    Hi,
    we have an SAP system, release mySAP 2004 SR1, and for "the upgrade project to SAP ECC 6.0" we installed a new mySAP 2004 on new hardware (sap03), where we also have Solution Manager running. The newly installed mySAP 2004 SR1 is also running fine on the new hardware.
    Now we would like to do a DB copy from the old system (sap02) to the new system (sap03), and then we will do the upgrade to SAP ECC 6.0.
    After comparing the two systems, I have the following questions:
    1) There are 4 software components which are not in the new system (sap03). These are:
    PI
    FIN_BLERP
    BP_INSTRASS
    Are these necessary for the DB copy, or can we do the DB copy without them?
    2) The software component levels are not the same. Should we have the same level to do the DB copy?
    Thanks in advance
    Best regards
    HanseAtik

    Hi Juan, hi Clas,
    just to be clear and so we don't get stuck along the way:
    The systems are as follow:
    1) the source system:
                         - Windows Server 2003,
                         - MSSQL Server 2005 with 32 bit.
                         - Mysap 2004 SR1
    2) Target system:
                       - Windows Server 2003
                       - MSSQL Server 2005, with 64 bit.
                       - Mysap 2004 SR1 (with a lower Support Package level and no PI, FIN_Basis, BP_BLERP and BP_INTASS components)
    I hope this will work. Could you post the link to the DB Copy guide we are talking about?
    That way we will be sure that we are talking about the same DB copy; otherwise any mistake in this step will cost more work and time.
    Thanks in advance
    Best regards
    HanseAtik

  • Facts at different Logical Dimension Level and Default Interaction (Drill)

    I have one Geography Logical Dimension that consists of 3 physical tables snowflaked. The 3 tables are City, State and Country. I have created 3 Level Based Hierarchies.
    I have 2 Facts. Fact_State is at the State grain only (Set the appropriate Content Level). Fact_City is at the City grain (Set the appropriate Content Level).
    When I pull a query with Country and a metric from Fact_State, the default interaction allows me to drill down. I drill into the State values. I would assume that the Drill interaction would stop here since this metric (Fact_State) is only at this level.
    What actually occurs is that it allows a drill into City, and when you look at the physical SQL it is trying to hit the other fact table (Fact_City) and pulls back no metrics. I then get no data back.
    I would ask: does anyone know if this is the expected behaviour? Is there any way to stop the drill at the State level when metrics are only pulled from the State fact?
    Please let me know if more information is needed.

    You need to have a metric sourced from both Fact_State and Fact_City, with both facts as logical table sources for the logical fact table.
    Make sure Fact_State is set to the proper (State) level and Fact_City to the detail level.
    When you pull Country or State with the metric, the query should hit Fact_State; when you drill down below State, the query should come from Fact_City with the proper data.

  • Making vendor down payment at PO header level and not item level

    Hi
    I am making a down payment to a vendor, using the new functionality in EHP 4 for making down payments through a purchase order. When I run ME2DP to make the down payment, it asks for the PO item number as well.
    But my requirement is to pay the down payment at PO header level, not at PO item level.
    Please let me know how I can make the payment at PO header level.

    Dear All,
    I am also facing the same problem. Though the advance is maintained at PO header level, when we try to create a DP request / DP, the system makes the line item mandatory. Suppose I have 2 line items and maintain the whole DP amount in the first line item. I make a service entry for the 2nd line item and bill for the second line item. In this case, the Down Payment clearing tab doesn't appear in MIRO. That means I am unable to adjust the advance for the line item, though I have to adjust it.
    The problem is that there are more than 300 line items in the PO/WO. Hence it becomes difficult to maintain an advance for each line, or the user is forced to maintain a certain amount in each line item.
    Please help. It's urgent.
    Regards,
    Ganesh

  • Levels and pre-fader metering

    In the past I had always presumed the channel fader controlled the volume level of a softsynth. I know now that it obviously does not. It simply acts as a "faucet," so to speak, meaning that increasing the fader level simply allows more of the synth's signal to pass.
    So my question is this: understanding that it is the instrument itself which is actually generating the signal (and consequently the dBs), should I keep all my faders at 0.00 on my audio instrument tracks and adjust the instrument levels instead when I mix? It would seem to make sense.
    And in keeping with this method, as far as automation is concerned, would I be correct in assuming track volume should be automated via the instrument level?
    I'm trying to keep all of this in the context of how it would be done in a studio with "real" instruments, where the volume level is the volume level: if you want it louder you play harder or turn up your amp; if you want it softer you play your instrument softer.
    I should add that this whole question/conundrum came up after I started recording with pre-fader metering enabled and was quite surprised to see just how easy it is to clip a track (remembering of course that as long as no clipping is occurring at the master output I'm NOT actually clipping).

    I don't know where all this "fader fear" is coming from.
    In DSP terms, making something louder or quieter is a really simple calculation. It doesn't matter whether you turn your softsynth down 3 dB or turn the fader down 3 dB; the end result is exactly the same.
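    To put numbers on it: a gain change of g dB multiplies each sample by 10^(g/20), so -3 dB is a factor of 10^(-3/20) ≈ 0.708. Whether the synth output or the fader applies that 0.708, the resulting samples are identical.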
    The faders are there for easy controlling of levels. It's why mixers were invented and designed this way.
    I think these days someone reads a post on some esoteric audio forum about how their mix was so much better when they left the fader at 0 dB in some obscure DAW back in the '90s, and translates that into "I must never use faders" or "I must never EQ" or "I must effectively work out my mix beforehand by my mic choice and positioning so all the signals magically combine into an artistic mix."
    I'm trying to keep all of this in the context of how it would be done in a studio with "real" instruments, where the volume level is the volume level: if you want it louder you play harder or turn up your amp.
    You completely lost me here. Learning about gain staging is Elementary Audio Engineering Class 2 (the one after "what is a signal" and "what is a transducer").
    You set the level of your recording device or mixer's input depending on the item you are recording, so the natural sound of whatever your source is corresponds to some nominal level for your recorder. It doesn't just magically happen on its own.
    If you're recording an instrument that doesn't put out much sound level, you use the mic preamp to boost the signal to an appropriate recording level. And if you're recording Concorde taking off, you adjust your preamp, probably turning it down or padding the signal, again so it fits the range of your recorder.
    Softsynths can put out a lot of level, and many preset designers don't pay much attention to volume levels in their patches. In addition, when you play a patch polyphonically, you're mixing together multiple signals (each of the notes you are playing) so you end up with higher levels still. It's good practice to pull down the output of the synth if it's too hot, but generally speaking, and avoiding getting into any "levels in Logic" complexities here, using the fader in Logic is exactly the same process.
    Faders don't bite. Use 'em.

  • How to split presentation level and business level using two ATG instances

    Hello All!
    We are investigating the possibility of splitting the ATG presentation level (web store with JSP pages and other presentation components) and business level (ATG components such as Pricing, Catalog, etc.). The first idea that we have is simply to start two instances of ATG: one instance serves client requests (presentation level) and communicates with another ATG instance (business level) where all ATG components are situated.
    The main problem is the Nucleus container, which is used for accessing all ATG components, and we don't know the right way to point to a Nucleus container situated on a remote ATG instance. Right now we have two ideas for how to establish communication between the two ATG instances:
    - try to replace the local Nucleus container with a remote one using RMI;
    - do not replace the Nucleus container, but implement some custom filter that can redirect all servlet requests to another ATG instance. In that case we will have two Nucleus containers.
    What do you think about all this? Do you have any other solutions for this task? Maybe we missed something? Can we deploy a cluster of ATG instances that communicate with each other?
    Thanks in advance.
    Andrey.

    Yes, an ATG system can have multiple nodes grouped in one or more clusters managed by a load balancer.
    If the services you are talking about are inherently ATG services like login, add to cart, and checkout, then it's better to implement them with ATG.
    ATG provides and supports both REST and SOAP based web services, and allows you to expose any ATG component as a service, thus making it available outside the ATG space.
    To manage load better, you can split your servers into page-serving servers and service-oriented servers and place them into multiple clusters.
    Though I haven't seen anyone using this kind of topology, so I am not sure whether there is any challenge in setting it up.

  • Want to upgrade 2008 R2 to 2012. Upgrade advisor giving errors relating to Database Compatibility Level and Server Collation

    I want to upgrade a SQL Server from 2008 R2 to 2012.  When I run upgrade advisor, I get the following error messages:
    Rule "Valid Database compatibility level and successful connection" failed.
    The report server database is not a supported compatibility level or a connection cannot be established. Use Reporting Services Configuration Manager to verify the report server configuration and SQL Server management tools to verify the compatibility level.
    Rule "Valid Database server collation and successful connection" failed.
    The SQL Server Database Engine is not configured with a valid server collation and cannot be used as the Reporting Services SharePoint Shared Service catalog database.
    The database called ReportServer has Collation = Latin1_General_CI_AS_KS_WS and Compatibility level = 100

    Hi Andrew,
    Regarding the first error message, please check the Reporting Services Configuration Manager, make sure that you connect to the ReportServer database from the correct server, and use a servername\instancename format connection string. For more details, please review this similar thread.
    Regarding the second error, it is caused by the current SQL Server Database Engine using an incompatible server collation.
    SQL Server 2012 Reporting Services SharePoint mode utilizes the SharePoint shared services architecture. SharePoint does not support a SQL Server Database Engine configured with case-sensitive server collations or binary server collations.
    And since the SQL Server Database Engine server collation property cannot be changed, you will not be able to complete an upgrade of Reporting Services. You will need to migrate your Reporting Services installation to a new server which is using a compatible server collation. For more details, please review the following article:
    Incompatible Database Engine Server Collation:
    https://msdn.microsoft.com/en-us/library/hh759335%28v=sql.110%29.aspx?f=255&MSPPError=-2147217396
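    If you want to confirm both properties yourself, here is a minimal T-SQL sketch (assuming, as above, that the report server database is named ReportServer):
    SELECT name, compatibility_level FROM sys.databases WHERE name = N'ReportServer';  -- supported level required
    SELECT SERVERPROPERTY('Collation') AS ServerCollation;  -- the server-wide collation SharePoint mode checks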
    Thanks,
    Lydia Zhang
    TechNet Community Support

  • Schema level and table level supplemental logging

    Hello,
    I'm setting up bi-directional DML replication between two Oracle databases. I have enabled supplemental logging at the database level by running this command:
    SQL>alter database add supplemental log data (primary key) columns;
    Database altered.
    SQL> select SUPPLEMENTAL_LOG_DATA_MIN, SUPPLEMENTAL_LOG_DATA_PK, SUPPLEMENTAL_LOG_DATA_UI from v$database;
    SUPPLEME SUP SUP
    IMPLICIT YES NO
    My question is: should I enable supplemental logging at the table level as well (for DML replication only)? Should I also run the commands below?
    GGSCI (db1) 1> DBLOGIN USERID ggs_admin, PASSWORD ggs_admin
    Successfully logged into database.
    GGSCI (db1) 2> ADD TRANDATA schema.<table-name>
    What is the difference between schema-level and table-level supplemental logging?

    For Oracle, ADD TRANDATA by default enables table-level supplemental logging. The supplemental log group includes one of the following sets of columns, in the listed order of priority, depending on what is defined on the table:
    1. Primary key
    2. First unique key alphanumerically with no virtual columns, no UDTs, no function-based columns, and no nullable columns
    3. First unique key alphanumerically with no virtual columns, no UDTs, or no function-based columns, but can include nullable columns
    4. If none of the preceding key types exist (even though there might be other types of keys defined on the table), Oracle GoldenGate constructs a pseudo key of all columns that the database allows to be used in a unique key, excluding virtual columns, UDTs, function-based columns, and any columns that are explicitly excluded from the Oracle GoldenGate configuration.
    The command issues an ALTER TABLE command with an ADD SUPPLEMENTAL LOG DATA clause that is appropriate for the type of unique constraint (or lack of one) that is defined for the table.
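    For illustration, the statement it issues looks roughly like this for a table with a primary key (a sketch, using a hypothetical table scott.emp):
    SQL> ALTER TABLE scott.emp ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS;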
    When to use ADD TRANDATA for an Oracle source database
    Use ADD TRANDATA only if you are not using the Oracle GoldenGate DDL replication feature.
    If you are using the Oracle GoldenGate DDL replication feature, use the ADD SCHEMATRANDATA command to log the required supplemental data. It is possible to use ADD TRANDATA when DDL support is enabled, but only if you can guarantee one of the following:
    ● You can stop DML activity on any and all tables before users or applications perform DDL on them.
    ● You cannot stop DML activity before the DDL occurs, but you can guarantee that:
    ❍ There is no possibility that users or applications will issue DDL that adds new tables whose names satisfy an explicit or wildcarded specification in a TABLE or MAP statement.
    ❍ There is no possibility that users or applications will issue DDL that changes the key definitions of any tables that are already in the Oracle GoldenGate configuration.
    ADD SCHEMATRANDATA ensures replication continuity should DML ever occur on an object for which DDL has just been performed.
    You can use ADD TRANDATA even when using ADD SCHEMATRANDATA if you need to use the COLS option to log any non-key columns, such as those needed for FILTER statements and KEYCOLS clauses in the TABLE and MAP parameters.
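    A minimal GGSCI sketch of that combination, assuming a hypothetical schema scott and a non-key column last_update needed for a FILTER:
    GGSCI (db1) 1> DBLOGIN USERID ggs_admin, PASSWORD ggs_admin
    GGSCI (db1) 2> ADD SCHEMATRANDATA scott
    GGSCI (db1) 3> ADD TRANDATA scott.emp, COLS (last_update)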
    Additional requirements when using ADD TRANDATA
    Besides table-level logging, minimal supplemental logging must be enabled at the database level in order for Oracle GoldenGate to process updates to primary keys and chained rows. This must be done through the database interface, not through Oracle GoldenGate. You can enable minimal supplemental logging by issuing the following DDL statement:
    SQL> alter database add supplemental log data;
    To verify that supplemental logging is enabled at the database level, issue the following statement:
    SELECT SUPPLEMENTAL_LOG_DATA_MIN FROM V$DATABASE;
    The output of the query must be YES or IMPLICIT. LOG_DATA_MIN must be explicitly set, because it is not enabled automatically when other LOG_DATA options are set.
    If you require more details, refer to the Oracle® GoldenGate Windows and UNIX Reference Guide 11g Release 2 (11.2.1.0.0).

  • Dimensions, Levels and Keys

    Folks,
    I am trying to deploy and load a dimension table composed of the following columns:
    Company Id
    Company_Name
    Corporation_Id
    Corporation_Name
    Cost_Center_Id
    Cost_Center_Name
    I have created three levels: Company, Corporation and Cost Center. Each Level is composed of the corresponding id and name.
    I have also created a single hierarchy, composed of the three levels: Company, Corporation and Cost Center.
    When I deploy the script, it creates a Unique Key constraint composed of the key at the lowest level - Cost_Center_Id.
    However, it is the combination of Company_Id, Corporation_Id and Cost_Center_Id that is required to uniquely identify a row in the dimension table.
    Any clue as to what I am doing wrong?
    Thanks. Bruce

    Hi,
    I believe hierarchies in OWB are one-to-many. In your example, a company can have many corporations and a corporation can have many cost centers, but a cost center can only belong to one corporation and a corporation can only belong to one company. If that were the case, then Cost_Center_Id would be the unique key. You're saying that the combination of Company_Id, Corporation_Id and Cost_Center_Id is the unique key, and this leads me to believe that you don't actually have a hierarchy (the way OWB sees it). One way to handle this is to put all the columns in a single level and add a unique identifier for each possible combination of the three.
    If you look at "best practices", I think you'll find that you should always use a single synthetic numeric value as the primary key in a dimension table. This ensures the most efficient join to the fact table. (If you have a three-column key, you need all three columns in the fact table, wasting space and creating a more complex join.) Also, you should try to avoid using any keys from an OLTP system as a key in your dimension.
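    To make that concrete, here is a minimal sketch of the single-level, synthetic-key approach (hypothetical names and types; in practice the surrogate key would be populated from a sequence):
    CREATE TABLE cost_center_dim (
      dim_key          NUMBER PRIMARY KEY,  -- synthetic surrogate key; the only column joined to the fact table
      company_id       NUMBER,
      company_name     VARCHAR2(100),
      corporation_id   NUMBER,
      corporation_name VARCHAR2(100),
      cost_center_id   NUMBER,
      cost_center_name VARCHAR2(100),
      CONSTRAINT cost_center_dim_uk UNIQUE (company_id, corporation_id, cost_center_id)
    );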
    Hope this helps,
    Roald

  • Can a CR work only for one aggregation level and not another?

    Hi Experts,
    I have a real-time InfoCube (RTIC), over which there is a MultiProvider, on top of which I have built 2 aggregation levels.
    There is a sequence of 2 characteristic relationships (CRs, based on an exit class) defined for the same RTIC.
    Is it possible for the CRs to be functional for one aggregation level and not for the second aggregation level?
    Thanks and Regards,
    Pankaj

    Additionally, let me explain the scenario in more detail:
    The difference between the 2 mentioned aggregation levels is 0CALMONTH, which is included in the second aggregation level.
    0CALMONTH is assigned as the target characteristic in the first CR; hence both CRs work for the input-ready report built on the first aggregation level and do not function with the input-ready report built over the second aggregation level.
    Therefore I wish to skip execution of the CRs entirely for input-ready reports based on the 2nd aggregation level.
    A CR of the derivation type is functional only if the selected target characteristic is not present in the aggregation level.
    Please let me know if any more details are required.
    Thanks,
    Pankaj

  • Fader levels and mixing

    Coming from years of analogue recording, I've been used to riding fader levels quite high (slamming the tape, etc.). However, I've noticed that in the DAW world this is actually NOT such good practice!
    I was wondering, though, if it is typical to have fader levels at such low levels. Sometimes my levels are at -18 dB or more!? However, everything sounds good and the final mix comes out fine with some mastering tools. But I have found that lower fader levels allow for better mixing... it's just odd, for me, to see such small scaling in the tracks' VU meters!
    Am I doing something incorrectly here? Is there some kind of calibration I am not aware of that raises the amount of fader play?
    Thanks.

    Clem et al,
    This information was helpful, as I too want to record in the -12 to -18 range for tracking, but I'm struggling a bit with gain staging at the tracking, mixing and mastering stages. My goal is to record as cleanly as possible and maintain headroom as I move between these three stages to a finished product. If you know of a good website that covers these fundamentals, that would also be helpful.
    Here's what I'm currently doing on a typical song (I'm doing this all myself due to the current budget):
    TRACKING STAGE: Record each audio part (typically vocals, guitar & bass) at -12 to -18 in 24-bit / 44.1 kHz (44.1k to save disk space... I can do 96 on my Apogee Duet if that offers significant benefits). Note that this is the *recording level* - not where the vocal/bass/guitar sits after pulling back the track fader.
    - Issues at this stage: When I record at this level I have to crank my Duet headphone output up to 100%, and it is still occasionally hard to hear the part vs the backing material I've already recorded. When I track at a higher level and then pull back the fader for the mix, this is not a problem.
    - My Summit 2BA-221 pre appears to be barely working to get this -12 to -18 in the Logic Arrange window.
    - Main question: What should be corrected in the above? Should I record the track at a higher level to "work" the mic/pre and add more tube "warmth" etc., or go some other route? If I track at this level, does that match up more closely with Logic's default compressor/limiter plug-in settings? Also, I notice that if I use an aux/send for drums, etc., the level coming from that aux can add up after enough tracks are sent to it.
    - I want strong vocals, but don't want to then hear the backing track drop (due to compression) when the vocal levels jump higher.
    MIXING STAGE: I have heard from more experienced musicians/engineers on other websites that I should mix down (bounce) to a stereo track prior to mastering and keep this at -3 to -4 prior to mastering. Is this target level correct? Should I be using a bus/mix compressor here for "glue" and/or EQ, or wait until the final mastering stage?
    MASTERING STAGE: Currently I am using Ozone 4 for this...any basic guidelines or pointers you can offer are very much appreciated.
    I realize that much of this is subjective and an art, but I'm looking for basic building blocks I can work from... to that end, any related info you can provide is very much appreciated. I realize you could respond with a book to the above, but I'm primarily interested in correcting my gain-staging approach as it relates to the original tracking-level discussion here.
    Thanks in advance for any assistance you can offer!

  • How to create & assign unique Qualification levels and Groups (each transaction)

    Hi,
    While working on CRM_LEAD I have observed that I cannot create unique Qualification levels or Groups/origins for each transaction; if I add one more Qualification level or Group/origin to the existing table, it is reflected in all corresponding Leads.
    Please tell me how to create separate Qualification levels and Groups/origins for each transaction, or whether we can use an Authorization Object for this.
    Please resolve.
    Regards,
    Dipesh.

    Hi Chirag,
    I've read the following link, but the question is....
    - Where is the link to create transport groups? That is my only question. It's as easy as that.
    Found it... sorry, I knew it was easy.

Maybe you are looking for

  • How to make macbook pro run faster

    Hi, Saw a post from admins I guess, listed below that post and my MacBook Pro details as required: Do an Apple Menu > About This Mac > More Information and copy and paste here what you see there EXCEPT the serial numbers at the bottom. Edit the serial

  • System is considering Freight during Excise.

    Hi SAP Gurus, We have one scenario where the vendor is maintained as a supply as well as a freight vendor. While doing a PO for this vendor, in conditions it is mentioned that the excise amount is added to inventory as CENVAT cannot be availed (no set-off).

  • Getting the servlet alias programatically

    Does anyone know if there is a way to specify the servlet's alias in the same way that one can find the context with request.getContextPath()? It seems a pity to have to hard-code the alias name. Thanks.

  • Maximum size of PSA

    What is the maximum size of the PSA? How many records can it accommodate at maximum?

  • SAP GUI 720 (patch level 6) - Date field problem

    Hello all, in selection screens (like transaction IW38) some users use the F4 help and the enter key to select a date (like field "Period"): they jump with the tab key to the field, delete the current entry and then press F4 and hit the enter key in