FBL5N brings no data

Hi Gurus,
I have a simple issue with this report.
When I execute it, it starts selecting many items, but when the screen with the expected result opens, it says the list contains no data. However, I am sure there is a setting somewhere to make the items appear.
Can anybody help, please?
Regards,
Roger

Hi,
This problem is due to the layout display variant for FBL5N. If you select layout variant 1SAP (or another variant), you can display the line items.
Reward points if this is useful.
Regards
Ravinagh Boni

Similar Messages

  • How to bring the data from application server to presentation server

    Hi,
    I have a problem. I have written a program that opens files on the application server when run in the background (SM37). Now I want to bring the same data from the application server to the presentation server in .csv format. How can I do this? Can anybody help me on this topic? Following is the code:
    *& Report  ZFPA_HIER_LOAD
    REPORT zfpa_hier_load.

    * Declaration of the Oracle connection
    DATA con_name LIKE dbcon-con_name VALUE 'COMSHARE'.
    DATA: mfl1(9), mfl2(5), mfl3(9), mfl4(2), mfl5(8) TYPE c.
    DATA mfilename TYPE string.
    DATA: BEGIN OF matab1 OCCURS 0,
            mfl1(9) TYPE c,
            mfl2(5) TYPE c,
            mfl3(9) TYPE c,
            mfl4(2) TYPE c,
            mfl5(8) TYPE c,
          END OF matab1.
    DATA setid(8) TYPE c.
    DATA: BEGIN OF source OCCURS 0,
            setid(8) TYPE c,
          END OF source.
    *PARAMETERS : p_pfile LIKE filename-fileextern.
    *PARAMETERS : m_bsenty(8). " type c obligatory.
    *mfilename = p_pfile.

    EXEC SQL.
      SET CONNECTION :con_name
    ENDEXEC.
    EXEC SQL.
      CONNECT TO :con_name
    ENDEXEC.
    EXEC SQL PERFORMING get_source.
      SELECT set_id FROM unit_set INTO :setid
      ORDER BY set_id
    ENDEXEC.

    START-OF-SELECTION.
      LOOP AT source.
        REFRESH matab1. CLEAR matab1.
        EXEC SQL PERFORMING evaluate.
          SELECT TO_CHAR(mem_id), TRIM(TO_CHAR(mem_pid))
          FROM unit_tree
          INTO :mfl1, :mfl5
          WHERE set_id = :source-setid ORDER BY mem_id
        ENDEXEC.
        IF source-setid = '80000000'.
          mfilename = '/tmp/aesorg'.
        ELSEIF source-setid = '80000006'.
          mfilename = '/tmp/Consolidation_Manager'.
        ELSEIF source-setid = '80000010'.
          mfilename = '/tmp/10org'.
        ELSEIF source-setid = '80000012'.
          mfilename = '/tmp/20org'.
        ELSEIF source-setid = '80000018'.
          mfilename = '/tmp/30org'.
        ELSEIF source-setid = '80000025'.
          mfilename = '/tmp/40org'.
        ENDIF.
        mfilename = '/usr/test.dat'.

    * This is what I tried: write the file on the application server ...
        OPEN DATASET mfilename FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
        IF sy-subrc <> 0.
          EXIT.
        ENDIF.
        LOOP AT matab1.
          TRANSFER matab1 TO mfilename.
        ENDLOOP.
        CLOSE DATASET mfilename.

    * ... and download the same data to the presentation server
        CALL FUNCTION 'GUI_DOWNLOAD'
          EXPORTING
            filename         = mfilename
            filetype         = 'ASC'
          TABLES
            data_tab         = matab1
          EXCEPTIONS
            file_write_error = 1
            invalid_type     = 2
            no_authority     = 3
            unknown_error    = 4
            OTHERS           = 10.
        CLEAR matab1.
      ENDLOOP.

    *&      Form  evaluate
    FORM evaluate.
      IF mfl5 = -1.
        mfl5 = ''.
      ENDIF.
      CONCATENATE mfl1 ',' INTO mfl1.
      CONCATENATE mfl1 ',' INTO mfl3.
      matab1-mfl1 = mfl1.
      matab1-mfl2 = 'ZBUE,'.
      matab1-mfl3 = mfl3.
      matab1-mfl4 = ' ,'.
      matab1-mfl5 = mfl5.
      APPEND matab1.
      CLEAR: mfl1, mfl2, mfl3, mfl4, mfl5.
    ENDFORM.                    "evaluate

    *&      Form  GET_SOURCE
    FORM get_source.
      source-setid = setid.
      APPEND source.
      CLEAR source.
    ENDFORM.                    "GET_SOURCE

    Hi Rammohan,
    You cannot use OPEN DATASET to transfer data from application server to presentation server.
    You can do the following :
    <b>Do 1st point in BACKGROUND</b>
    1. Read the data file from application server into an internal table using OPEN DATASET
    <b>Do 2nd point in Foreground</b>
    2. Once you get the data into an internal table, then use FM GUI_DOWNLOAD to download it on presentation server
    You cannot use the above 2 point together in Background because its not possible. Hence you need program it partially in background and partially in foreground.
    Best regards,
    Prashant
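    A minimal sketch of that two-step split (report name, file paths and the flat string structure are placeholders, not from the original post):

    ```abap
    *----------------------------------------------------------------*
    * Step 1 - runs fine in background (SM37): read the server file  *
    *----------------------------------------------------------------*
    REPORT zread_appserver_file.

    DATA: gv_line  TYPE string,
          gt_lines TYPE TABLE OF string,
          gv_file  TYPE string VALUE '/tmp/test.csv'.  " placeholder path

    OPEN DATASET gv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc = 0.
      DO.
        READ DATASET gv_file INTO gv_line.
        IF sy-subrc <> 0. EXIT. ENDIF.
        APPEND gv_line TO gt_lines.
      ENDDO.
      CLOSE DATASET gv_file.
    ENDIF.

    *----------------------------------------------------------------*
    * Step 2 - must run in dialog: GUI_DOWNLOAD needs the SAP GUI    *
    *----------------------------------------------------------------*
    CALL FUNCTION 'GUI_DOWNLOAD'
      EXPORTING
        filename = 'C:\temp\test.csv'   " local PC path (placeholder)
        filetype = 'ASC'
      TABLES
        data_tab = gt_lines
      EXCEPTIONS
        OTHERS   = 1.
    IF sy-subrc <> 0.
      MESSAGE 'Download failed - run this step in foreground' TYPE 'I'.
    ENDIF.
    ```

    In practice step 2 is often skipped entirely: if the file already sits on the application server, users can pull it to their PC with transaction CG3Y.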

  • Integration connections to bring master data into APO and BW system.

    Hi All,
    Can somebody give the step-by-step activities to be done for creating the integration connections for the following requirement?
    1) To bring master data from the R/3 system to the APO system through BI cubes, using a BI-type connection.
    2) An APO/R3-type connection from the R/3 system to the APO system.
    Regards

    Nikhil,
    Could you let me know how a Basis person can help with this?
      If this is an existing system, the applications team (APO and BW) will not require much assistance from you.  You can mostly sit back and watch them do all the work.
    I suggest that you read the help link in its entirety provided by expert Datta.  Much of the info there is helpful for both solutions.
    For the APO interface (question 2 in your original post) there is a complete step-by-step guide provided by SAP.  Check these two docs out.
    http://help.sap.com/bp_scmv250/BBLibrary/Documentation/B02_BB_ConfigGuide_EN_DE.doc
    http://help.sap.com/bp_scmv250/BBLibrary/Documentation/B05_BB_ConfigGuide_EN_DE.doc
    If you are being asked to select from one of the two scenarios you outlined, the standard Core interface is by far the simpler to implement.  Plus, it has the advantage that SAP will support you when something goes wrong.
    The BW solution is different, in that the developer (you?) MUST have detailed specifications before he can proceed.  Which master data?  At what organizational levels?  How often refreshed?  What will be the ultimate use of the data, and how must it be stored?  Where will it be stored? How to deal with inconsistencies?  Etc etc. SAP support will only solve technical problems in your solution, they won't solve errors of design or logic, unless you happen to get a very accommodating support person when you raise your message.
    But in case a Basis person needs to create the RFC, how can it be done for the above scenario?
    The instructions for creating all the RFC stuff is contained in the connectivity doc I cited above.
    Best Regards,
    DB49

  • Bringing xml data into flash

    I would like to know how to bring XML data into Flash; specifically, images where someone can click a thumbnail and view the larger image. Can anyone point me to some tutorials or sites where I can start to learn how to do this?
    Thanks.

    Hi
    First of all, read everything you can in Flash Help to get to know the different methods of loading XML. XML can be loaded with built-in components or parsed in ActionScript; the Help files explain both.
    Next, Google for "XML Image or Picture Gallery" and you'll find plenty.
    Hope it helps

  • My (Snowleopard) MBP died, but I had it backed up externally with TimeMachine. I bought an Air and used Migration Assistant which worked to bring the data over.  However, I've now got two versions of "me" on Air - I only want one with all my info. Help!

    I know it's generally recommended (if you're going to bring data from one machine to another) to update the OS on the source computer before the transfer. However, my MBP died unexpectedly, and my information was backed up with Time Machine under Snow Leopard. The Air I bought as a replacement runs Lion.
    I was able to use Migration Assistant to bring my info/programs/etc. over... but I've now got the user set up initially (in store) with Lion when I made the purchase, plus a second "persona" that contains all of my backed-up information from the Time Capsule.
    Given that the Airs have (relatively speaking) smaller amounts of storage, and that I don't want to be bouncing back and forth between profiles, I'm wondering if it's safe to delete the "new user profile" (i.e. the one set up via Lion while in store) and just keep and use the "me" I brought in from the Time Capsule.
    I've never updated a Mac OS before, let alone under these circumstances, so I've got a fair amount of trepidation around the whole process.
    What is the best way to wind up with one user profile, keeping all the information from my old machine and deleting anything superfluous as part of the secondary profile?
    Thanks in advance for any input you can provide surrounding this process!

    Thank you! 
    I can't tell you how terrified I was when deleting the superfluous account...but it went just swimmingly!

  • FBL5N & S_ALR_87012168 - Due Date Analysis for Open Items

    Hi,
    As mentioned above, I have found that there is a difference in the account balances between the FBL5N report and the due date analysis for customers, whereas the FBL1N figures and the vendor due date analysis figures in the report are the same.
    I just want to know why there is a difference on the customer side.
    I require an immediate reply... please help me.
    Thanks in advance.
    jai

    Hi
    I could not understand which report you compared with S_ALR_87012172.
    Have you tried matching FD10N and FBL5N? If they do not match, there could be a problem with authorizations in FBL5N.
    From report S_ALR_87012168 you can also drill down to view the line items; if you drill down and list the line items, it will give you some clue.
    Check whether any parked documents are included in FBL5N.
    Also ensure that in report S_ALR_87012168, "Database" is selected under Data sources.
    Regards

  • How to see (net due date + payment term's days) in fbl5n as a date

    Hi experts,
    I need some information about FBL5N fields.
    I can see the net due date and the terms of payment fields in FBL5N, but if the invoice has a payment term (30 days additional, etc.) I want to see (net due date + payment term's extra days). For example, if the net due date is 01.06.2011 and the payment term gives an extra 20 days, how can I see 21.06.2011 in FBL5N or on any other screen?
    Edited by: Burak Akdasli on Jun 22, 2011 3:43 PM

    Hi
    I understand from your question that you want the billing date?
    If that is your question, you can fetch it from sales order > item > billing tab.
    The logic you mentioned is confusing; please be more specific about the scenario you are trying.
    Reward if it helped
    Chandru
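    The date the poster wants is just baseline date + payment-term days. A simplified sketch of that arithmetic (the values here are the ones from the question; the standard calculation, e.g. in function module NET_DUE_DATE_GET, also considers the other discount periods ZBD2T/ZBD3T):

    ```abap
    DATA: lv_baseline TYPE d VALUE '20110601',  " BSEG-ZFBDT, baseline date
          lv_days     TYPE i VALUE 20,          " BSEG-ZBD1T, days from payment term
          lv_due      TYPE d.

    * ABAP date arithmetic: adding an integer adds calendar days
    lv_due = lv_baseline + lv_days.
    WRITE: / 'Net due date:', lv_due.           " 21.06.2011 in DD.MM.YYYY format
    ```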

  • How to get .oce to talk to DAS & bring back data in Interactive Reporting?

    Hi Hyperion Experts,
    This is also a fairly basic question. I use Hyperion 9.3.1, and I have Shared Services, Workspace and Interactive Reporting Studio running on our Windows 2003 Server.
    My goal is to import a .bqy document I created in Interactive Reporting Studio into workspace, and be able to click on the "refresh" button and see it refreshes the data from the backend Oracle 10g database.
    I have completed the following steps so far,
    1. I have created a DSN. This DSN, named "Oracle 10g Data Source", when I clicked on "Test Connection" button, It ran successfully.
    2. I have created a new Datasource entry in the DAS service. If I double click on the newly created Datasource entry, the screen would display the following details.
    Connectivity Type = Oracle Net
    Database Type = Oracle
    Enter the name of the Data source = Oracle 10g Data Source
    Select the name of the Data source (it is grayed out, nothing selected)
    Server/File (it is grayed out, nothing entered)
    then, followed by some default connection parameters.
    3. I have created an "Oracle 10g Data Source.oce" file using Interactive Reporting Studio. I saved the .oce file at
    C:\Hyperion\BIPlus\data\Open Catalog Extensions\Oracle 10g Data Source.oce
    4. I created a new constrProj_Oracle10gOCE.bqy, during the creation, I chose
    "Create a new document with Recent Database Connection File" option, then I chose the Oracle 10g Data Source.oce that I have created in step 3. After I log in with host user/password, I have created a data model and I see that it brings back a data set when I clicked on the "process" button. I saved this .bqy document.
    5.
    I imported the Oracle 10g Data Source.oce and constrProj_Oracle10gOCE.bqy into the workspace under a folder called
    Root/MYBI/Test
    6. I right click on the constrProj_Oracle10gOCE.bqy document, and chose "Open". It opened with Results section highlighted on the left hand side, and with data showing on the right hand side. However, when I clicked on the "refresh" button, I'd get a message box with a red X. It says,
    An Interactive Reporting Service error has occurred -
    Failed to acquire requested service.
    (2001)
    I guess the data I see in this constrProj_Oracle10gOCE.bqy was just the cached data that was brought back when I hit the process button of this .bqy when it was first created in the Studio, and that it never established any connection via the DAS service to the backend.
    Could someone please show me what I have done wrong here?
    thanks,
    hypuser1010

    EricaHarris,
    Yes, the issue is still open. Obviously, I have gone down the path of "Scenario One".
    Based on your reply, I have done the following.
    There are three things I am configuring here: DSN, DAS and OCE.
    1) When I created the DSN,
    I started the ODBC Data Source Administrator, Click on second tab "System DSN". I added a new ODBC Data Source.
    Data Source Name = Oracle 10g Data Source
    TNS Service Name = //146-abc.xyz.com:1521/MYORCL0617
    2) I re-created the DAS entry, per your suggestion, I made sure that "the name of the data source will be the host name you specified in your OCE".
    Connectivity Type = Oracle Net
    Database Type = Oracle
    Enter the name of the data source= //146-abc.xyz.com:1521/MYORCL0617
    3) for my OCE, I specified
    Connection Software = Oracle Net
    type of Database = Oracle
    on the first screen.
    then, on the next screen, I specified
    Host = //146-abc.xyz.com:1521/MYORCL0617
    I redeployed the OCE and the .bqy document and restarted DAS. Yes, my .bqy document now refreshes its data successfully from the backend.
    However, I am not 100% convinced that the .bqy uses the OCE which talks to the DAS, which talks to the DSN entry and brings data back to the .bqy document.
    Here is what I did,
    I opened up the DSN entry, changed the TNS Service Name from //146-abc.xyz.com:1521/MYORCL0617 to
    //146-abc.xyz.com:1521/NONSENSE. I then restarted DAS and redeployed the .bqy document. the .bqy document is still able to refresh its data. So, I have just proven that the .bqy document does not need a valid DSN entry to work, how come??
    Can you (or anyone) please explain the phenomenon that I see?
    Also, a very basic question. you know that when you create a .bqy in the Studio, you "import" an OCE into the .bqy document so that the .bqy knows that it should use this OCE to talk to the backend data source. but, when you deploy this .bqy in workspace, how does the .bqy know that it is supposed to go through DAS (and DSN) to interact with the database, rather than the "internal" OCE that was pre-built with this .bqy?
    thanks,
    hypuser1010

  • FBL5N  - incorrect  Due Dates for Downpayment invoices

    Hi.
    On a Down Payment Invoice to customer, the Due Date shown (on Invoice Form) is Correct according to the Terms of payment.  (eg : Baseline date + 8 days)
    However, in the FBL5N, the Net Due Date field shows the Baseline date as due date !
    Why is it so? How do I ensure that FBL5N displays correct due date according to the terms of payment.
    Thanks in advance for your suggestion.
    regards,
    Jai

    Hi,
    I had a similar problem with a credit memo.
    The system behaves as follows: a credit memo without an invoice reference is always due on the baseline date (the exceptions to this are credit memos which have an explicitly stated value date). This logic was introduced starting from Release 3.1I; see the details in Notes 79871 and 84137.
    Summary:
    1. A credit memo with an invoice reference has the same terms of payment and baseline date as the invoice.
    2. For a credit memo without reference to an invoice, there are two methods available for choosing the due date:
    a) The due date for net payment (RFPOS-FAEDT) is identical to the baseline date for payment (BSEG-ZFBDT). The period for the net term of the terms of payment is not taken into account. This is the default.
    b) Or the credit memo is due based on the payment terms. For this, you must enter the indicator "V" in the credit memo field BSEG-REBZG ("Invoice ref."), in addition to the specified payment method. You can modify this field for credit memos that are already posted.
    => The above details are documented in the F1 help behind the "Invoice ref." field.
    Manual filling of the field can take place during entry of the credit memo, or in a document change transaction: run transaction FB02 to call up the credit memo and drill down to the customer line item. Here you should be able to see the field "Invoice ref." (BSEG-REBZG). Place a 'V' in this field; with this, the credit memo is due based on the payment terms.
    To automate this process, you can use user exit SDVFX001 to fill the value 'V' in field BSEG-REBZG. Steps to activate the user exit:
    - Transaction CMOD: create a project (your own naming convention) and include SAP enhancement SDVFX001.
    - The function module called by SDVFX001 is EXIT_SAPLV60B_008; make your ABAP coding in include program ZXVVFU01 and populate field ACCIT-REBZG with 'V'.
    - Activate the project via CMOD.
    I hope this is helpful for you.
    Thanks,
    Marco Vismara
    Message was edited by:
            Marco Vismara
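    A sketch of what the coding in that include could look like (an assumption-laden illustration, not SAP's delivered code: verify the actual interface of EXIT_SAPLV60B_008 and which structures are visible in ZXVVFU01 in SE37 on your release before using anything like this):

    ```abap
    *& Include ZXVVFU01 - user exit SDVFX001, called from EXIT_SAPLV60B_008
    * Sketch only: set "Invoice ref." to 'V' so the credit memo becomes
    * due per its payment terms instead of on the baseline date.
    IF vbrk-vbtyp = 'O' AND accit-rebzg IS INITIAL.  " 'O' = credit memo (assumed check)
      accit-rebzg = 'V'.
    ENDIF.
    ```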

  • Generates Export DataSource  doesn't bring any data

    I have an export DataSource generated from an InfoCube full of data, but when I execute the InfoPackage it doesn't bring any of this data. When I check it in RSA3 it doesn't bring anything either!
    In the PROD system the same DataSource does bring data!
    What am I missing here?
    Thanks

    In the InfoProvider tab, when you right-click on any InfoProvider, you have the option to create the "Export DataSource" in order to take the data from this provider, bring it anywhere else, and be able to schedule it through the InfoPackage.
    Yes, the InfoPackage is in full mode, but the thing is that even in RSA3 it doesn't bring anything, even though the data is there!

  • 0COPA_C01 Extraction Brings invalid data

    Hi Friends,
    We are trying to load 0COPA_C01 (standard cube) from an R/3 data source. When we do a full load it brings some records, which is much less than the actual number. Also, the data it brings contains 0.00 as the maximum values, and all the key figures are blank. But when we give some data selection criteria, it brings records with matched key figures only for that key field column; the others remain blank or 0.00.
    Please help.
    Thanks & Regards
    InduShekhar

    Hi Prasanna
    I thought that you were on BW 3.x.
    Anyway, in this case you should create a process chain and schedule it in the start process itself.
    There is still one older option: define the subsequent process in the InfoPackage, schedule it, and once the PSA request is successful raise an event; then place the same event in the start/schedule process of the DTP.
    <u><b>But as per expert suggestion, you should go for a process chain.</b></u>
    Thanks & Regards
    R M K
    **Winners don't do different things, they do things differently**
    > Hi R M K, thank you,
    >
    > If we go to InfoPackage -- Schedule tab,
    > and under the option "Start Later in Background" set
    > periodic values,
    > then it will update only the PSA.
    >
    > But if we want to get these records into the InfoCube
    > we need to execute the DTP.
    >
    > That is not happening.
    > (Is selecting the option "start later in background" and
    > setting the periodic values sufficient?)
    > You have not discussed the DTP here,
    >
    > because I don't want to load manually.
    >
    > How do we execute the DTP automatically?
    >
    > thanks
    > prasanna

  • Update routine bringing wrong data in Report

    Hello Experts,
    I have a strange situation with inventory data. We have update rules from data source 2LIS_03_BF to our inventory cube. There is a characteristic in the cube which was not doing anything in the past (I mean, it was not mapped), which I am currently filling using an update routine. The cube data looks absolutely fine for that field; it brings what it is supposed to bring from my routine. But when I run the report on this cube, the field is getting some numbers from somewhere, which are totally wrong.
    I checked whether there is a BAdI written on this field (virtual characteristic), but there seems to be none for this characteristic, except for some key figures based on some other characteristics.
    I am confused as to why the report brings some wrong numbers when it is supposed to get the correct values from the cube.
    Please advise,
    Regards,
    JB

    Hi,
    I hope you have checked it for non-cumulative objects, as there is a different way of aggregation in that case.
    Also, if the inventory is modelled on cumulative key figures, and you are sure there is no code written for virtual key figures, then it could be exception aggregation in the reporting settings or in the key figure's general settings.
    Also check whether you are considering the historical data from the cube and its effect on the report output.
    Just do a check and let us know.
    regards
    Ajeet

  • Record Number of users logged in to Portal and bring the data in to BI

    Hi,
    I have a requirement to identify the number of users logging in to the SRM Portal, bring that information into BI, and report on it.
    Can anyone please tell me how I can find this?
    I checked tables USR01, USR02, USR03, USR41, etc., but none of them gives the proper user/server information needed.
    Transaction SM04 logs users logging in to both the SRM server and the SRM portal, but I need to find the underlying table for SM04, with which I can get the number of users logged in to the portal in a particular date range.
    If not, can anyone please tell me the best way to identify the number of users logged in to the SRM portal on a particular date, and to capture that information, bring it into BI, and report on it there?
    Thanks in advance
    Pooja
    Edited by: Pooja on Nov 25, 2008 5:10 PM

    Please check the link below; it should guide you in identifying the number of users logged in to the Portal, which can be used for your reporting.
    [/message/319314#319314] (original link is broken)
    Thanks,
    Sruthi
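    For the SM04 part of the question, the list behind it can be read programmatically with function module TH_USER_LIST (a sketch: parameter names can vary by release, so check SE37). Note this only shows sessions currently logged on to the server, not historical portal logons; for history over a date range you would need something like ST03N user statistics or the security audit log (SM19/SM20).

    ```abap
    REPORT zcount_logged_on_users.

    DATA: gt_users TYPE STANDARD TABLE OF uinfo,
          lv_count TYPE i.

    * Read the current user sessions on this application server (the SM04 list)
    CALL FUNCTION 'TH_USER_LIST'
      TABLES
        list = gt_users.

    DESCRIBE TABLE gt_users LINES lv_count.
    WRITE: / 'Sessions currently logged on:', lv_count.
    ```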

  • Schedule line bringing todays date altough stock available earlier(backlog)

    Here is the issue:
    The project is in the backlogging stage. On the MM side the postings are done for August, and plenty of stock is available for sales orders to be created.
    Now when we create sales orders, we change the pricing date, document date and required delivery date for the backlogging of the month of August.
    When the availability check runs, it brings the schedule lines given below:
                        Order qty   Round qty   Confirmed qty   Date
        material X         50          50              0        15-Aug-2010
        material X          0           0             50        today's date
    Although the stock is available for the month of August, and this is the first order I created on the production system, there is no chance of the stock being reserved for other orders.
    I need a quick solution. Thank you.

    Hi,
    I think that ATP will never confirm in the past, as it doesn't make any sense from an ATP point of view.
    If that is not what you need, then just disable ATP, and the system will confirm anything on the required date.

  • Need to bring out data from Infocube to Flatfile

    Hi All,
    I have a cube with historical data in one system.
    Now my requirement is to create an exactly similar cube in the other system and load the historical data into it.
    We are considering the options below:
    1) Fetching the data from the cube (from the first system) into a flat file and loading it into the new cube (other system);
                                             or
    2) Going for an open hub to bring the data out into a flat file and loading the data into the new cube in the other system.
    Note: open hub is a tool we may need to buy (we are thinking about it; we are not sure).
    There is no relationship between the two systems that would allow copying the cube from one system to the other.
    Please suggest the best option.
    Regards,
    Sri

    Thank you all for your prompt responses.
    Actually the company got split into two; one of them has kept the cube with the other company's information.
    So they are not even ready to provide display access to their system; creating an RFC connection and using APD to fetch the data to a flat file might not do.
    What we are thinking is to fetch the data from the cube output into a spreadsheet and load that into the new cubes.
    We have a similar concern with master data objects: we need to take out the attributes, texts and hierarchies data and load them into the new system.
    How do we bring out the data for master data attributes, texts and hierarchies? Does this work the same way as with the cube, especially with hierarchies?
    Please suggest
    Regards,
    Srikanth
