J2IUN amount is less when compared to the payable G/Ls of BED and ECS

Dear gurus,
                   For the month, the J2IUN amount is less when compared to the payable G/L accounts of BED, ECS and SECS. What are the reasons for this, and what is a possible solution?
Thanks
Venky

Correct, there is really no plan they offer that is uncapped, just like they told you.  The cell network data solutions are not meant to be used for home internet; they are meant for occasional use by someone who needs mobile internet.  And right again, congestion is one of the biggest problems LTE has caused for VZW: Nashville, DC, Portland, Silicon Valley...

Similar Messages

  • I do not have many messages, but my iPhone 5s says messages take up 4.6GB of space.  When comparing with friends who have many more messages, less than 1 GB is being used on their phones.  Why are mine taking up so much space and how do I correct that?

    I do not have that many messages on my new iPhone 5s (maybe about 15 conversations and only one or two going back a little ways), yet when I go to Settings > Usage it shows that my messages are taking up 4.6GB of space.  When comparing with others (some of whom have hundreds of messages) they are using under 1GB for their messages.  Why are mine taking up so much space??

    I found a post with this solution that worked for me. Do a backup, then download iBackupBot. It lets you look into the contents of that backup. Look in system files > mediadomain > library > sms > attachments. I found a ton of huge files related to long-deleted texts. iBackupBot let me delete all that old crap. I then restored the phone with that now-modified backup and bingo! Freed up 9 GB. No problems.
    But note, if you are not technically inclined you probably don't want to be digging around and deleting backup files. Delete the wrong thing and you could jack up your phone badly. Be careful.

  • ODI supports ELT technology. How is it better when compared to ETL?

    In any conventional ETL tool we are able to perform complex logic very easily by using the components inside the tool.
    I have not found any such components or transformations in ODI.
    But when I read the documentation, it says that ODI supports complex logic.
    So how is it better when compared to any ETL tool in the market?

    Hi Harmeet,
    How are you?
    Seems like you are on fire...:-)
    Well, yes, ODI is an E-LT tool, and I guess it is the only tool in the market which follows the E-LT architecture.
    Coming to the comparison,
    An ETL tool needs three servers to move the source data to the target: the source system, the transformation engine and the data warehouse (the data is moved twice in this approach).
    In E-LT, the transformation engine/server and the data warehouse server are combined into one, so only two servers are needed to do the transformation. Because of this, the cost is lower and the speed is very good.
    This is one of the big advantages of using the E-LT architecture.
    Experts comments are welcome...:-)
    Thanks,
    Guru

  • Data in the cube is showing multiple entries when compared with ODS

    Hello BW Gurus,
    We have a waste report in production planning on a cube and an ODS separately. The same InfoPackage loads both targets (which means the same InfoSource), but when we run a report on the cube, the records show multiple entries (i.e. the key figures do not match when compared to the ODS), whereas the ODS records show correctly, as in R/3. There are 6 key figures in total, of which 4 are pulled from R/3 and 2 are populated in BW.
    An Example:
    Waste report in PP run for plant 1000 for 12/2005 and process order 123456. The operational scrap should be 2.46% and the component scrap should be 3.00% for material 10000000. The report is showing 7.87% for planned operational waste % and 9.6% for planned component waste %. These values are not correct. The ODS values for order 123456 matched the data in R/3 for component and operational scrap.
    There is a start routine for the ODS and also for the cube. I am not good at ABAP, so I am requesting your help.
    Here is the ODS Code:
    tables:  /BI0/PPRODORDER.
      loop at data_package.
        select single COORD_TYPE
                      PRODVERS
          into (/BI0/PPRODORDER-COORD_TYPE,
                /BI0/PPRODORDER-PRODVERS)
          from /BI0/PPRODORDER
         where PRODORDER = data_package-PRODORDER
           and OBJVERS   = 'A'.
        if sy-subrc = 0.
          if /BI0/PPRODORDER-COORD_TYPE = 'XXXX'
          or /BI0/PPRODORDER-COORD_TYPE = 'YYYY'.
            data_package-PRODVERS = space.
          else.
            data_package-PRODVERS = /BI0/PPRODORDER-PRODVERS.
          endif.
        endif.
        if data_package-calday = space
        or data_package-calday = '00000000'.
          if data_package-TGTCONSQTY NE 0.
            data_package-calday = data_package-ACTRELDATE.
          endif.
        endif.
        modify data_package.
      endloop.
    Here is the Cube Code:
    tables:  /BI0/PPRODORDER,
             /BIC/ODS.
      TYPES:
      BEGIN OF ys_mat_unit,
        material                 TYPE /bi0/oimaterial,
        mat_unit                 TYPE /bi0/oimat_unit,
        numerator                TYPE /bi0/oinumerator,
        denomintr                TYPE /bi0/oidenomintr,
      END OF ys_mat_unit.
      DATA:
        l_s_mat_unit             TYPE ys_mat_unit,
        e_factor                 type p decimals 5.
      loop at data_package.
        select single COORD_TYPE
                      PRODVERS
          into (/BI0/PPRODORDER-COORD_TYPE,
                /BI0/PPRODORDER-PRODVERS)
          from /BI0/PPRODORDER
         where PRODORDER = data_package-PRODORDER
           and OBJVERS   = 'A'.
        if sy-subrc = 0.
          if /BI0/PPRODORDER-COORD_TYPE = 'XXX'
          or /BI0/PPRODORDER-COORD_TYPE = 'YYY'.
            data_package-PRODVERS = space.
          else.
            data_package-PRODVERS = /BI0/PPRODORDER-PRODVERS.
          endif.
        endif.
        if data_package-calday = space
        or data_package-calday = '00000000'.
          if data_package-TGTCONSQTY NE 0.
            data_package-calday = data_package-ACTRELDATE.
          endif.
        endif.
        data_package-agsu     = 'GSU'.
        data_package-agsu_qty = 0.
        select single gr_qty
                      base_uom
          into (/BIC/ODS-gr_qty,
                /BIC/ODS-base_uom)
          from /BIC/ODS
         where prodorder = data_package-prodorder
           and material  = data_package-material.
        if sy-subrc = 0.
          if /BIC/ODS-base_uom = 'GSU'.
            data_package-agsu_qty = /BIC/ODS-gr_qty.
          else.
            SELECT SINGLE * FROM /bi0/pmat_unit
              INTO CORRESPONDING FIELDS OF l_s_mat_unit
              WHERE material = data_package-material
                AND mat_unit = 'GSU'
                AND objvers  = 'A'.
            IF sy-subrc = 0.
              IF l_s_mat_unit-denomintr <> 0.
                e_factor = l_s_mat_unit-denomintr /
                           l_s_mat_unit-numerator.
                multiply /BIC/ODS-gr_qty by e_factor.
                data_package-agsu_qty = /BIC/ODS-gr_qty.
              ENDIF.
            else.
              CALL FUNCTION 'UNIT_CONVERSION_SIMPLE'
                EXPORTING
                  INPUT                = /BIC/ODS-gr_qty
                  NO_TYPE_CHECK        = 'X'
                  ROUND_SIGN           = ' '
                  UNIT_IN              = /BIC/ODS-base_uom
                  UNIT_OUT             = 'GSU'
                IMPORTING
                  OUTPUT               = DATA_PACKAGE-gsu_qty
                EXCEPTIONS
                  CONVERSION_NOT_FOUND = 1
                  DIVISION_BY_ZERO     = 2
                  INPUT_INVALID        = 3
                  OUTPUT_INVALID       = 4
                  OVERFLOW             = 5
                  TYPE_INVALID         = 6
                  UNITS_MISSING        = 7
                  UNIT_IN_NOT_FOUND    = 8
                  UNIT_OUT_NOT_FOUND   = 9
                  OTHERS               = 10.
            endif.
          endif.
        endif.
        modify data_package.
      endloop.
    Somehow the AGSU qty is not populating in the cube, and when I debug the code I can see a clean record in the internal table but not in the cube.
    Your suggestions and solutions would be highly appreciated.
    thanks,
    Swathi.

    Hi Swathi
    In an ODS we have the option of overwrite and addition; however, in a cube we have only addition. That is why you are getting multiple entries.
    If you are running a daily full load on the cube, then please delete the earlier requests,
    so that at any point in time there is only one full load request in the cube. Hope this will solve your problem.
    Regards,
    Monika

  • Data in the cube is showing wrong when compared with ODS

    Hello BW Gurus,
    We have a waste report in production planning on a cube and an ODS separately. The same InfoPackage loads both targets (which means the same InfoSource), but when we run a report on the cube, the records show multiple entries (i.e. the key figures do not match when compared to the ODS), whereas the ODS records show correctly, as in R/3. There are 6 key figures in total, of which 4 are pulled from R/3 and 2 are populated in BW.
    An Example:
    Waste report in PP run for plant 1000 for 12/2005 and process order 123456.  The operational scrap should be 2.46% and the component scrap should be 3.00% for material 10000000.  The report is showing 7.87% for planned operational waste % and 9.6% for planned component waste %.  These values are not correct.  The ODS values for order 123456 matched the data in R/3 for component and operational scrap.
    There is a start routine for the ODS and also for the cube. I am not good at ABAP, so I am requesting your help.
    Here is the ODS Code:
    tables:  /BI0/PPRODORDER.
      loop at data_package.
        select single COORD_TYPE
                      PRODVERS
          into (/BI0/PPRODORDER-COORD_TYPE,
                /BI0/PPRODORDER-PRODVERS)
          from /BI0/PPRODORDER
         where PRODORDER = data_package-PRODORDER
           and OBJVERS   = 'A'.
        if sy-subrc = 0.
          if /BI0/PPRODORDER-COORD_TYPE = 'XXXX'
          or /BI0/PPRODORDER-COORD_TYPE = 'YYYY'.
            data_package-PRODVERS = space.
          else.
            data_package-PRODVERS = /BI0/PPRODORDER-PRODVERS.
          endif.
        endif.
        if data_package-calday = space
        or data_package-calday = '00000000'.
          if data_package-TGTCONSQTY NE 0.
            data_package-calday = data_package-ACTRELDATE.
          endif.
        endif.
        modify data_package.
      endloop.
    Here is the Cube Code:
    tables:  /BI0/PPRODORDER,
               /BIC/ODS.
      TYPES:
      BEGIN OF ys_mat_unit,
        material                 TYPE /bi0/oimaterial,
        mat_unit                 TYPE /bi0/oimat_unit,
        numerator                TYPE /bi0/oinumerator,
        denomintr                TYPE /bi0/oidenomintr,
      END OF ys_mat_unit.
      DATA:
        l_s_mat_unit             TYPE ys_mat_unit,
        e_factor                 type p decimals 5.
      loop at data_package.
        select single COORD_TYPE
                      PRODVERS
          into (/BI0/PPRODORDER-COORD_TYPE,
                /BI0/PPRODORDER-PRODVERS)
          from /BI0/PPRODORDER
         where PRODORDER = data_package-PRODORDER
           and OBJVERS   = 'A'.
        if sy-subrc = 0.
          if /BI0/PPRODORDER-COORD_TYPE = 'XXX'
          or /BI0/PPRODORDER-COORD_TYPE = 'YYY'.
            data_package-PRODVERS = space.
          else.
            data_package-PRODVERS = /BI0/PPRODORDER-PRODVERS.
          endif.
        endif.
        if data_package-calday = space
        or data_package-calday = '00000000'.
          if data_package-TGTCONSQTY NE 0.
            data_package-calday = data_package-ACTRELDATE.
          endif.
        endif.
        data_package-agsu     = 'GSU'.
        data_package-agsu_qty = 0.
        select single gr_qty
                      base_uom
          into (/BIC/ODS-gr_qty,
                /BIC/ODS-base_uom)
          from /BIC/ODS
         where prodorder = data_package-prodorder
           and material  = data_package-material.
        if sy-subrc = 0.
          if /BIC/ODS-base_uom = 'GSU'.
            data_package-agsu_qty = /BIC/ODS-gr_qty.
          else.
            SELECT SINGLE * FROM /bi0/pmat_unit
              INTO CORRESPONDING FIELDS OF l_s_mat_unit
              WHERE material   = data_package-material
                AND mat_unit   = 'GSU'
                AND objvers    = 'A'.
            IF sy-subrc = 0.
              IF l_s_mat_unit-denomintr <> 0.
                e_factor = l_s_mat_unit-denomintr /  
                              l_s_mat_unit-numerator.
                multiply /BIC/ODS-gr_qty by e_factor.
                data_package-agsu_qty = /BIC/ODS-gr_qty.
              ENDIF.
            else.
              CALL FUNCTION 'UNIT_CONVERSION_SIMPLE'
                EXPORTING
                  INPUT                = /BIC/ODS-gr_qty
                  NO_TYPE_CHECK        = 'X'
                  ROUND_SIGN           = ' '
                  UNIT_IN              = /BIC/ODS-base_uom
                  UNIT_OUT             = 'GSU'
                IMPORTING
                  OUTPUT               = DATA_PACKAGE-gsu_qty
                EXCEPTIONS
                  CONVERSION_NOT_FOUND = 1
                  DIVISION_BY_ZERO     = 2
                  INPUT_INVALID        = 3
                  OUTPUT_INVALID       = 4
                  OVERFLOW             = 5
                  TYPE_INVALID         = 6
                  UNITS_MISSING        = 7
                  UNIT_IN_NOT_FOUND    = 8
                  UNIT_OUT_NOT_FOUND   = 9
                  OTHERS               = 10.
            endif.
          endif.
        endif.
        modify data_package.
      endloop.
    Somehow the AGSU qty is not populating in the cube, and when I debug the code I can see a clean record in the internal table but not in the cube.
    Your suggestions and solutions would be highly appreciated.
    thanks,
    Swathi.

    Hi Swathi,
    Maybe you want to look into the way the % is being calculated in the cube. If the formula involves counting the number of records, then you will also be counting the negative records that are posted in the cube, unless you have compressed the cube. That might give you wrong numbers.
    Doniv

  • Hi Team, one week back I bought a new iPhone 5 in India. They have given me a used phone which has a different IMEI number when compared to the IMEI number on the box. Please let me know how to proceed further

    Hi Team, one week back I bought a new iPhone 5 in India. They have given me a used phone which has a different IMEI number when compared to the IMEI number on the box. Please let me know how to proceed further.

    When you went back to the place where you purchased this phone, & asked them, what did they say?
    No one here can help you with this, nor can/will Apple. You need to take this up with whoever you purchased this phone from.

  • I can't seem to get individual elements when comparing 2 arrays using Compare-Object

    My backup software keeps track of servers with issues using a 30-day rolling log, which it emails to me once a week in CSV format. What I want to do is create a master list of servers, then compare that master list against the new weekly lists to identify servers that are not in the master list, and vice versa. That way I know which servers are new problems, which ones are pre-existing, and which ones dropped off the master list. At the bottom is the entire code for the project. I know it's a bit much, but I want to provide all the information, hopefully making it easier for you to help me :)
    Right now the part I am working on is in the Compare-NewAgainstMaster function, beginning on line 93. After putting one more (fake) server in the master file, the output I get looks like this:
    Total entries (arrMasterServers): 245
    Total entries (arrNewServers): 244
    Comparing new against master
    There are 1 differences.
    InputObject SideIndicator
    @{Agent= Virtual Server in vCenterServer; Backupse... <=
    What I am trying to get is just the name of the server, which should be $arrDifferent[0] or possibly $arrDifferent.Client. Once I have the name(s) of the servers that are different, I can do stuff with them. So either I am not accessing the array right, not building the array right, or not using Compare-Object correctly.
    Thank you!
    Thank you!
    Sample opening lines from the report
    " CommCells > myComCellServer (Reports) >"
    " myComCellServer -"
    " 30 day SLA"
    CommCell Details
    " Client"," Agent"," Instance"," Backupset"," Subclient"," Reason"," Last Job Id"," Last Job End"," Last Job Status"
    " myServerA"," vCenterServer"," VMware"," defaultBackupSet"," default"," No Job within SLA Period"," 496223"," Nov 17, 2014"," Killed"
    " myServerB"," Oracle Database"," myDataBase"," default"," default"," No Job within SLA Period"," 0"," N/A"," N/A"
    Entire script
    # things to add
    # what date was server entered in list
    # how many days has server been on list
    # add temp.status = pre-existing, new, removed from list
    # copy sla_master before making changes. Copy to archive folder, automate rolling 90 days?
    ## 20150114 Created script ##
    #declare global variables
    $global:arrNewServers = @()
    $global:arrMasterServers = @()
    $global:countNewServers = 1
    function Get-NewServers
    {
        Param($path)
        Write-Host "Since we're skipping the 1st 6 lines, create test to check for opening lines of report from CommVault."
        Write-Host "If not original report, break out of script"
        Write-Host ""
        #skip 5 to include headers, 6 for no headers
        (Get-Content -path $path | Select-Object -Skip 6) | Set-Content $path
        $sourceNewServers = Get-Content -path $path
        $global:countNewServers = 1
        foreach ($line in $sourceNewServers)
        {
            #declare hashtable to hold the object properties temporarily
            $temp = @{}
            $tempLine = $line.Split(",")
            #get and assign values (strip the leading quote/space and the trailing quote)
            $temp.Client = $tempLine[0].Substring(2, $tempLine[0].Length - 3)
            $temp.Agent = $tempLine[1].Substring(2, $tempLine[1].Length - 3)
            $temp.Backupset = $tempLine[3].Substring(2, $tempLine[3].Length - 3)
            $temp.Reason = $tempLine[5].Substring(2, $tempLine[5].Length - 3)
            #write temp object to array
            $global:arrNewServers += New-Object -TypeName psobject -Property $temp
            #increment counter
            $global:countNewServers++
        }
        Write-Host ""
        $exportYN = Read-Host "Do you want to export new servers to new master list?"
        $exportYN = $exportYN.ToUpper()
        if ($exportYN -eq "Y")
        {
            $exportPath = Read-Host "Enter full path to export to"
            Write-Host "Exporting to $($exportPath)"
            foreach ($server in $arrNewServers)
            {
                $newtext = $server.Client + ", " + $server.Agent + ", " + $server.Backupset + ", " + $server.Reason
                Add-Content -Path $exportPath -Value $newtext
            }
        }
    }
    function Get-MasterServers
    {
        Param($path)
        $sourceMaster = Get-Content -path $path
        $global:countMasterServers = 1
        foreach ($line in $sourceMaster)
        {
            #declare hashtable to hold the object properties temporarily
            $temp = @{}
            $tempLine = $line.Split(",")
            #get and assign values
            $temp.Client = $tempLine[0]
            $temp.Agent = $tempLine[1]
            $temp.Backupset = $tempLine[2]
            $temp.Reason = $tempLine[3]
            #write temp object to array
            $global:arrMasterServers += New-Object -TypeName psobject -Property $temp
            #increment counter
            $global:countMasterServers++
        }
    }
    function Compare-NewAgainstMaster
    {
        Write-Host "Total entries (arrMasterServers): $($countMasterServers)"
        Write-Host "Total entries (arrNewServers): $($countNewServers)"
        Write-Host "Comparing new against master"
        #Compare-Object $arrMasterServers $arrNewServers
        $arrDifferent = @(Compare-Object $arrMasterServers $arrNewServers)
        Write-Host "There are $($arrDifferent.Count) differences."
        foreach ($item in $arrDifferent)
        {
            $item
        }
    }
    ## BEGIN CODE ##
    cls
    $getMasterServersYN = Read-Host "Do you want to get master servers?"
    $getMasterServersYN = $getMasterServersYN.ToUpper()
    if ($getMasterServersYN -eq "Y")
    {
        $filePathMaster = Read-Host "Enter full path and file name to master server list"
        $temp = Test-Path $filePathMaster
        if ($temp -eq $false)
        {
            Read-Host "File not found ($($filePathMaster)), press any key to exit script"
            exit
        }
        Get-MasterServers -path $filePathMaster
    }
    $getNewServersYN = Read-Host "Do you want to get new servers?"
    $getNewServersYN = $getNewServersYN.ToUpper()
    if ($getNewServersYN -eq "Y")
    {
        $filePathNewServers = Read-Host "Enter full path and file name to new server list"
        $temp = Test-Path $filePathNewServers
        if ($temp -eq $false)
        {
            Read-Host "File not found ($($filePathNewServers)), press any key to exit script"
            exit
        }
        Get-NewServers -path $filePathNewServers
    }
    #$global:arrNewServers | Format-Table client, agent, backupset, reason -AutoSize
    #Write-Host ""
    #Write-Host "Total entries (arrNewServers): $($countNewServers)"
    #Write-Host ""
    #$global:arrMasterServers | Format-Table client, agent, backupset, reason -AutoSize
    #Write-Host ""
    #Write-Host "Total entries (arrMasterServers): $($countMasterServers)"
    #Write-Host ""
    Compare-NewAgainstMaster

    do not do this:
    $arrDifferent = @(Compare-Object $arrMasterServers $arrNewServers)
    Try this:
    $arrDifferent = Compare-Object $arrMasterServers $arrNewServers -PassThru
    ¯\_(ツ)_/¯
    This is what made the difference. I guess you don't have to declare arrDifferent as an array; it is automatically created as an array when Compare-Object runs and fills it with the results of the compare operation. I'll look at that -PassThru option in a little more detail. Thank you very much!
    Yes - this is the way PowerShell works.  You do not need to write so much code once you understand what PowerShell can do and is doing.
    ¯\_(ツ)_/¯
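    For anyone who finds this later, here is a minimal, self-contained sketch of why -PassThru helps (the server objects below are made-up examples, not the real CommVault report data). Without -PassThru, Compare-Object wraps each difference in a result object and the original record sits in its InputObject property; with -PassThru, the original objects come back directly (with a SideIndicator note property added), so .Client is available as expected.
    # Hypothetical master/new lists with the same properties the script builds
    $arrMasterServers = @(
        [pscustomobject]@{ Client = 'myServerA'; Agent = 'vCenterServer'; Backupset = 'defaultBackupSet'; Reason = 'No Job within SLA Period' },
        [pscustomobject]@{ Client = 'myServerB'; Agent = 'Oracle Database'; Backupset = 'default'; Reason = 'No Job within SLA Period' }
    )
    $arrNewServers = @(
        [pscustomobject]@{ Client = 'myServerA'; Agent = 'vCenterServer'; Backupset = 'defaultBackupSet'; Reason = 'No Job within SLA Period' }
    )
    # Without -PassThru each difference is a wrapper object; the original record
    # is in the InputObject property, so the server name is $_.InputObject.Client
    Compare-Object $arrMasterServers $arrNewServers |
        ForEach-Object { '{0} {1}' -f $_.InputObject.Client, $_.SideIndicator }
    # With -PassThru the original objects are returned (plus a SideIndicator
    # note property), so $_.Client works directly
    Compare-Object $arrMasterServers $arrNewServers -PassThru |
        ForEach-Object { '{0} {1}' -f $_.Client, $_.SideIndicator }
    # Both print: myServerB <=
    If only the server name should drive the comparison, adding -Property Client (with or without -PassThru) is another option.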

  • Advantages of using LabVIEW in embedded applications when compared to C?

    Hello all,
    I am looking to develop an embedded application with the help of LabVIEW programming.
    I have started with the LM3S8962 evaluation board.
    So I would like to know:
    1. What are the advantages of using LabVIEW in embedded applications when compared to C?
    2. Can we deploy the code on any kind of controller by writing drivers for it?

    Hello, the LM3S8962 microcontroller is a good device for developing applications of medium complexity. Regarding your questions:
    1. If you work on applications with embedded systems, LabVIEW will save you time in prototyping. We know that if you work with a specific manufacturer's microcontroller, you must learn the language you are working in (ASM, BASIC, C); then, if you change hardware, you have to start adapting your code to the libraries of the new compiler all over again. With LabVIEW this does not happen.
    2. LabVIEW only works with some microcontroller manufacturers and specific models.
    Atom
    Certified LabVIEW Associate Developer

  • Features available in PI 7.11 when compared with PI 7.1

    Hi Experts,
                 What are the extra features available in PI 7.11 when compared with PI 7.1? What monitoring techniques were added in 7.11?
    Regards,

    Search in SDN before posting:
    http://wiki.sdn.sap.com/wiki/display/XI/PI7.1EHP1DeltaFeaturesoverPI+7.1
    Regards,
    Raj

  • Why poor quality AIC & Apple ProRes HQ video when compared to original video files!

    First, I'm a long-time PC user who has recently switched to Macs, and I'm rather picky about the quality of my videos.
    Problem: I have HD video from a Canon HF S10 camcorder and a Canon EOS 7D that looks fantastic on a PC... but looks very so-so once imported into iMovie, FCE or FCP. I've tried all three programs using AIC, and yes, I've even tried Apple ProRes HQ (FCP 7.0 log & transfer), and I still end up with very poor quality MOV files when compared to the originals. Part of the problem (or benefit of a PC) is that the PC actually plays the original raw MTS/M2TS/MOV files without any trouble and they look unbelievable, whereas my Mac has to import/convert the files to AIC or Apple ProRes HQ... so no matter what codec I use, my Mac produces video that is not even close to what I get out of my PC.
    My original raw files are all 1920x1080 60i and are MTS files from the HF S10 and MOV files from the 7D. Even the original raw MOV files produced by the 7D don't look as good on my Mac as they do on my PC. I even took an SD card with the original files down to the local Apple Store to see how they look on a MacBook Pro, and they still don't look like the originals on my PC. Also, all my comparisons are being done side by side on two 24" Apple Cinema Displays at full size (1920x1080) that are calibrated. What am I doing wrong and why the loss in quality? Any ideas as to why the quality is just so-so when compared to the originals would be greatly appreciated. Thanks in advance.

    The thing that is wrong is that you are judging the quality of VIDEO on COMPUTER monitors. Properly calibrated or no...this is not how you do it. Calibration for computer monitors sets them right for PHOTOSHOP and color, not VIDEO and proper monitoring. Only getting a signal to a broadcast VIDEO monitor, or HDTV, will show you what you really have. And for that you need an HD input/output device or capture card...and the HDTV or HD monitor.
    Computer monitors are never the place to judge video quality. Resolution is typically lowered to allow for full frame playback. Even if paused.
    Shane

  • Advantages of BW Query in SAP Portal when compared to WEB BEx Analyzer

    Hi
    What are the advantages of having a BW query in the SAP Portal when compared to the Web BEx Analyzer?
    I have to present to higher managers the case for having the SAP Portal for BW reports rather than the Web BEx Analyzer.
    Can anyone please update me with a few points (10-20) that will be useful for my presentation?
    screen to geethikakrishna->gmail
    Thanks in advance

    Hi Krishna ,
    I think you should highlight the portal's role in the presentation.
    Nowadays all organizations use the SAP Portal as their user interface. We can log in to the portal once and get all the information from BW, CRM and other SAP systems as well as non-SAP systems,
    so there is no need to log in to each and every system individually.
    The portal offers a single point of access to SAP and non-SAP information sources, enterprise applications, information repositories, databases and services in and outside your organization—all integrated into a single user experience. It provides you the tools to manage this knowledge, to analyze and interrelate it, and to share and collaborate on the basis of it.
    With its role-based content, and personalization features, the portal enables users—from employees and customers to partners and suppliers—to focus exclusively on data relevant to daily decision-making processes
    For all information on portals refer to help.sap.com->documentation->netweaver->Netweaver2004->English
    In People Integration you will find information on portals:
    http://help.sap.com/saphelp_nw04/helpdata/en/19/4554426dd13555e10000000a1550b0/frameset.htm
    check below threads for some information
    easiest way to publish
    Easiest way to post existing BW queries onto portal
    http://help.sap.com/saphelp_nw04s/helpdata/en/43/92dceb49fd25e5e10000000a1553f7/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/9d/24ff4009b8f223e10000000a155106/frameset.htm
    Execution of BW Reports Through Enterprise Portal
    Best Practices for BI Report publishing on Portal - BI 7
    Use of BI's Enterprise Portal
    The web application designer offers rich functionality for creating dashboards. In SAP NetWeaver BI 2004s (aka 7.0), when you publish a template it is automatically created as an iView for the portal.
    SAP NetWeaver 2004s BI - Define your Publishing Strategy Part 1
    Koti Reddy

  • What is the use of JSP when compared with Struts

    What is the use of JSP when compared with Struts?

    JSP tag libraries are great for reusable content formatting and logic.
    For example, let's say you have a shopping site. Each item you sell is stored in a database, and you get them out by category, creating a List of ItemBeans. You always want to display the items with a category header, then a <table> with the item number, the description and the price.
    Instead of creating a bunch of logic in the JSP that does this, you can pass it on to a tag that might look like this in your JSP:
    <shopping:itemTable category="${selectedCategory}" items="${itemsForCategory}" />
    This would make the JSP easier to read and work with.
    The actual uses are incredible. Have you used the <jsp:useBean ...> tag? That is an example of a custom tag library in use.
    Furthermore, look into JSTL (the JSP Standard Tag Library). It is a collection of tags (API by Sun, coding by Apache) used to do many of the standard actions you might want or need to do in JSPs, like a conditional tag (c:if, only do something if the test is true), multiple-conditional tags (c:choose, c:when, c:otherwise) like an if / else if / else construct, looping through an array or Collection (c:forEach), storing values in scopes (c:set), formatting numbers and dates (the fmt library), XML transformations (the xml library), and lots of other things that you could replace scriptlet code with.

  • Material Document List MB51 - Residual Amount when Quantity equals to Zero

    Dear all,
    I've this weird situation in the production system involving one material master as follows:
    Material         Description     Plant     Mvt     Posting Date     Quantity     BUn     Loc.curr.amount
    10015661     SCREW     61MM     561     30.04.2006     45     PC     2,660.40
    10015661     SCREW     61MM     101     09.03.2007     96     PC     5,177.43
    10015661     SCREW     61MM     101     24.03.2008     50     PC     3,124.75
    10015661     SCREW     61MM     101     27.02.2009     84     PC     4,483.46
    10015661     SCREW     61MM     101     28.02.2009     84     PC     4,433.87
    10015661     SCREW     61MM     101     28.02.2009     84     PC     4,312.56
    10015661     SCREW     61MM     309     07.04.2009     45-     PC     2,660.40-
    10015661     SCREW     61MM     309     07.04.2009     96-     PC     3,823.63-
    10015661     SCREW     61MM     309     07.04.2009     50-     PC     3,142.69-
    10015661     SCREW     61MM     309     07.04.2009     84-     PC     4,695.03-
    10015661     SCREW     61MM     309     07.04.2009     168-     PC     9,261.59-
                                                  Sum:      0     PC     609.13
    P/S: Sorry if the data gets messy after saving this message because I'm unable to format the columns correctly.
    If you look closely, there is a residual amount even though the quantity is already zero. I tried to simulate this in the quality system but was unable to recreate the same results. By right, the sum amount should be zero as well.
    I would be most grateful for any explanation or advice.
    Thank you.
    Edited by: Steven Khoo on Apr 6, 2011 12:21 PM

    Hi all,
    I'm attempting to interpret the report generated by MR51, shown below. I noticed there are line items with the "RE" accounting document type, which stands for "Invoice - Gross". I can understand WA = Goods Issue and WE = Goods Receipt, but why RE? For instance, there is a WE (Goods Receipt) of 50 PC on 24.03.2008, followed by a series of RE (Invoice - Gross) documents between 26.03.2008 and 23.04.2008. I'm confused here. Thanks.
    Material          Material Description     CoCd     ValA
    10015661          SCREW                               1125            6110
    Type     DocumentNo     Itm     Postg Date      Quantity     BUn      Amt.in loc.cur.  Crcy
    WA     4900011305     1     07.04.2009     -168     PC     -9,261.59           RM
    WA     4900011304     1     07.04.2009     -84     PC     -4,695.03           RM
    WA     4900011303     1     07.04.2009     -50     PC     -3,142.69           RM
    WA     4900011302     1     07.04.2009     -96     PC     -3,823.63           RM
    WA     4900011301     1     07.04.2009     -45     PC     -2,660.40           RM
    RE     5100002095     796     31.03.2009     84     PC     254           RM
    RE     5100002093     280     31.03.2009     84     PC     211.57           RM
    RE     5100002089     276     31.03.2009     84     PC     261.16           RM
    WE     5000002412     789     28.02.2009     84     PC     4,312.56           RM
    WE     5000002408     271     28.02.2009     84     PC     4,433.87           RM
    WE     5000002406     275     27.02.2009     84     PC     4,483.46           RM
    RE     5100001398     2     29.04.2008     -96     PC     -1,353.80           RM
    RE     5100001385     2     23.04.2008     50     PC     0.68           RM
    RE     5100001384     2     23.04.2008     50     PC     0.68           RM
    RE     5100001383     2     23.04.2008     50     PC     2.97           RM
    RE     5100001382     2     23.04.2008     50     PC     2.31           RM
    RE     5100001341     2     26.03.2008     50     PC     11.3           RM
    WE     5000001702     1     24.03.2008     50     PC     3,124.75           RM
    WE     5000000779     1     09.03.2007     96     PC     5,177.43           RM
    WA     4900005345     1     30.04.2006     45     PC     2,660.40           RM
    P/S: Sorry, the data gets messy after saving the post.
    Edited by: Steven Khoo on Apr 25, 2011 10:11 AM

  • What are the advantages of using Oracle Database when compared to SQL Server

    Hi all
    Can anyone please tell me what the advantages of using Oracle Database are when compared to SQL Server?
    Thanks in advance
    Balamurugan S

    user12842738 wrote:
    Hi,
    There are various differences between the two.
    1. SQL Server is only Windows, but Oracle runs on almost all platforms.
    2. You can have multiple databases in SQL Server, but Oracle provides you only one database per instance.
    Response: Given that the very term 'database' has a different meaning in the two products, this "difference" is absolutely meaningless.
    3. SQL Server provides T-SQL for writing programs, whereas Oracle provides PL/SQL.
    Response: Which means what? Both products have a procedural programming language. They named them differently, and the languages are not interchangeable. That means nothing in comparing the features, strengths, weaknesses and suitability to purpose.
    4. Backup types in both are the same (except Oracle provides an additional backup called Logical Backup).
    Response: You make that sound like "Logical Backup" is something more than it is. It is nothing more than an export of the data and metadata. Many experts don't even consider it a backup. I'm sure SQL Server provides the same functionality, though they probably call it by some other name.
    5. Both provide High Availability.
    Response: Well, I guess they both have a suite of features they refer to as "High Availability". But what does that really mean? The devil is in the details. Remember, the two products don't even agree on what constitutes a "database".
    6. Both come in various distributions.
    Response: ???
    If you are going for an implementation, you can try SQL Server Express Edition and Oracle XE, which are free to use.
    Then you can choose whichever is comfortable for your needs.
    Thanks.

  • Content server pros and cons when compared to others

    Hi DMS Gurus,
    We are in the process of deciding which storage area to go for, and the client is keen on using a common existing server, as he is not keen on spending on a DMS content server.
    My questions are:
    Can we transform any existing file server or system (which is already in use) into a content server just by installing Windows 2003 and the Content Server CDs? If so, please tell me how to go about this.
    Is the Content Server the only preferred storage for processing original files through Web DMS, or will storage in the SAP database, vault or archive also give this functionality?
    Please let me know the major pros (benefits) and cons (demerits) that Content Server storage gives when compared to the other storage options.
    Points for sure,
    thanks in advance
    Shanti

    Shanti,
    ArchiveLink:
    You use SAP ArchiveLink as a communication interface between the document management system and the archiving systems. You can display the archived data using the SAP ArchiveLink viewer. Many times there are documents generated which need to be tied to a DIR, and hence you use ArchiveLink to tie these to a transaction, while the originals can be stored in the content server.
    KPRO
    Apart from attaching the originals, KPro provides a wide range of other functionality, such as content services,
    where after every check-in and check-out of a DIR you can maintain the working copies in the content server.
    Knowledge Provider is a component of SAP Web Application Server and provides the general infrastructure for storing and administering documents. The Content Server and the Cache Server are server components that interact with the Knowledge Provider.
    Knowledge Provider provides the following services:
    - Services for KPro client applications
    - Document Management Framework (DMF)
    - Document Management Service (DMS)
    - Content Management Service (CMS)
    - Integration of content servers and cache servers
    Content Server: allows you to integrate SAP Content Server or external content servers.
    Cache Server: allows you to store documents close to the client (caching).
    Content Modeling Tool: Document Modeling Workbench (DMWB).
    Hope this helps
    Paddy
