Datasource and Data flow for SAP table AFVC and field AUFPL

Hi,
While mapping SAP fields to DataSource fields we have come across table AFVC and field AUFPL. I am not able to find a DataSource for the above-mentioned table or field. Can anyone please help me out?
Thanks,
Datta

Hi DLK,
You can search using SE11:
Go to SE11 in the ECC system, enter the table name and click on the where-used list.
On the next screen select "Structure" and press Enter.
The result screen will give you the list of structures that use the table; from these you can identify the extract structure and its DataSource.
Regards,
Venkatesh

Similar Messages

  • Data Validation for SD Sales order and Deliveries

    Hello All,
    We activated the data flow for SD Sales Orders and Deliveries for InfoCubes 0SD_C04 and 0SD_C05 and their corresponding DataSources.
    I have 2 queries:
    1. How to validate the BI SD sales order data against ECC.
    2. How to validate the BI SD deliveries data against ECC.
    Kindly help.
    Thanks,
    Rakesh.

    Hi Rakesh,
    You have to check the tables in R3 and then compare with the DataSources in BW.
    For example:
    VBAK table in R3 for Sales Orders corresponds to 2LIS_11_VAITM in BW.
    VBRK table in R3 for Billing corresponds to the 2LIS_13* DataSources in BW.
    Check how many sales orders were created in the VBAK table on a given day and compare with the count in BW.
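    For example, a quick count on the R3 side (a minimal sketch, assuming direct SQL access to the R3 database; VBAK-ERDAT is the document creation date):
    [code]
    -- Count the sales orders created in R3 on one day (date format YYYYMMDD);
    -- compare this number with the records loaded into BW for the same day.
    SELECT COUNT(*)
      FROM VBAK
     WHERE ERDAT = '20110621';
    [/code]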
    Rgds
    SVU123

  • Creation of a generic extractor and data source for the FAGLFLEXA table

    Hi All,
    I need to create a generic extractor and DataSource for the FAGLFLEXA table to support AR reporting. This table contains the profit center information necessary to perform LOB reporting against the AR data.
    Please advise on how to do this.
    Regards, Vishal

    Hi Vishal,
    This is a fairly simple task:
    1. Go to RSO2 and choose the relevant option, i.e. whether you want to create a transaction data, master data or text DataSource.
    2. Name it accordingly and then create it.
    3. Give it a description and then enter the table name FAGLFLEXA.
    4. Save it and activate. If you need it to be delta-enabled, click on "Generic Delta" and choose the delta field accordingly.
    If you still face some problem then do mail me at [email protected]
    Assign points if helpful.
    Regards,
    Himanshu

  • BUG: Export DDL and Data fails for mixed case table/column names

    Hi there,
    I have found a bug in SQL Developer. See details below.
    Description:
    When "Export DDL and Data) function is used on a table/columns not named in UPPERCASE, sql generated by SQL Developer is invalid.
    Steps to reproduce:
    - open SQL Developer, connect to DB
    - make a table named "lowerCase" (in double quotes, so it won't be automatically changed to capital letters)
    - you may also add some columns, for example "lowerCol1", "UpCol2", ALLUPCOL3
    - add some data rows to the table
    - choose Tools -> Export DDL and Data
    - check the exporting of tables and data; on the "filter" tabs choose your "lowerCase" table
    - press "Apply"
    Error:
    Generated SQL contains invalid INSERTs: mixed-case table and column names are referenced without the obligatory double quotes, which yields an error when the generated script is executed (see the relevant line below).
    -- DDL for Table lowerCase
    CREATE TABLE "DBO_HT"."lowerCase"
    (     "lowerCol1" VARCHAR2(100),
         "UpCol2" VARCHAR2(100),
         "ALLUPCOL3" VARCHAR2(100)
    );
    -- DATA FOR TABLE lowerCase
    -- FILTER = none used
    -- INSERTING into lowerCase
    Insert into lowerCase (lowerCol1,UpCol2,ALLUPCOL3) values ('lc','uc','auc');
    -- END DATA FOR TABLE lowerCase
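    For comparison, a corrected INSERT would quote every mixed-case identifier, e.g.:
    [code]
    -- Mixed-case identifiers in Oracle must be enclosed in double quotes;
    -- ALLUPCOL3 may stay unquoted because unquoted names are stored in uppercase
    Insert into "lowerCase" ("lowerCol1","UpCol2",ALLUPCOL3) values ('lc','uc','auc');
    [/code]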
    Remarks
    SQL Developer: version 1.2.1, build MAIN-32.13
    Oracle DBs: 9.2 & Express
    OS: Windows 2000 Professional
    If you need any more details/testing, let me know. I'd really appreciate a quick patch for this issue...
    Alternatively, do you know of any other simple way of copying a single database (it's called a schema in Oracle, right?) from one computer to another? Ideally something as simple as detaching, copying and reattaching the mdf (data) files in SQL Server... I thought that this "Export DDL&Data" function would do, but as you can see I couldn't use it.
    I just need a simple solution that works: one operation on the source, get the resulting files to the other computer, and one operation to have it running there... I think that such a scenario is very basic, yet I just can't achieve it, and I am simply not allowed to spend more time on it (read: our test project fails, my company rejects my "lobbying" and stays with MSSQL :/ )
    Thanks a lot & bye

    Thanks for your reply.
    ad. 1)
    You're right. I just wanted to give some very short feedback on my experiences with SQL Developer, so I didn't think starting new threads would be necessary, but as I was writing it became much bigger than I initially planned - sorry about that. I will create proper threads as soon as possible. Having an "Edit post" button on this forum would also be useful.
    ad. 2)
    Generally, you're right - in most cases it's true that "switching DBMS is a major commitment" and "you will produce terrible code" if you don't learn the new one.
    However, I think that you miss one part of the market here - the market that I think Express is also targeted at. I'd call it the "fire&forget databases" market; MySQL comes to mind as possibly the most common solution here. These are rather small systems, possibly web-accessed, whose data-throughput requirements are modest; the point is to store data at all, and not necessarily in the fastest way, because given the amount of data involved, even on low-end hardware it will work well enough. What's important here is general ease of use - how easy it is to set up such a system, connect and access data, develop software using it, how much maintenance is needed, how easy this maintenance is, and how easy the most common development tasks are: creating a DB, moving a DB from a test to a production server, etc. There, "how easy" directly translates to "how much time we need to set it up", which translates to "how much the development will cost".
    Considering current technology, switching the DBMS in such systems is not necessarily a major commitment, and believe me, you will not produce terrible code. In many cases it's as simple as changing a switch in your ORM toolkit: hibernate.dialect = Hibernate.Dialect.OracleDialect vs MySQLDialect vs MsSql2005Dialect.
    Therefore, in some parts of the market it's easy to switch DBMS, even on a project-by-project basis. The reason to switch will appear when another DBMS makes life easier => development faster. From that point of view, I can understand my colleagues giving me an embarrassed look and saying "come on, I won't read all these docs just to have the db copied to the test server". And it doesn't mean "they are not willing to learn anything new"; it's just that they feel such a basic task should have a self-explanatory solution that doesn't require mastering any special knowledge. And if they get such simple solutions somewhere else, it costs them nothing to change the hibernate dialect.
    I think Oracle did a great job introducing Express to this "fire&forget" market. The installation is a snap, it just works out of the box with nothing serious to configure, the opposite of what I remember from installing and working with Oracle 9 a few years ago. In some places it's still "you need to start SQL*Plus and enter this script", but it's definitely less than before. I also find SQL Developer a great tool; it can do most of what we need to do with the DB, and it's much more pleasant to use than the Oracle 9 tools. Still, a few basic things require too much hassle, and I'd say taking your schema to another machine is one of them. So I think that, if you do it well, the "schema copy wizard" you mentioned might be very helpful. If I were to give any general advice for the Express line of DB/tools, I'd say "make things simple" - make it "a DB you can't see".
    That's, IMHO, the way to attract more Express users.

  • How to find unused Transfer Rules and DataSources for a master data table?

    How can I find unused transfer rules and DataSources for a master data table? My requirement is that I need to delete those transfer rules and DataSources which are not in use.

    Hi,
    Go to the Manage screen of the text or attribute table of the master data object, see which ones are being loaded daily from there, and delete the remaining.
    Cheers.
    Answered as expected, please reward.

  • Missing data packages for PSA table in SAP NetWeaver 2004s

    When we tried to load the master data 0CUSTOMER we got the message:
    "Information Idoc: Sent, but did not arrive:
    Missing data packages for PSA Table"
    Diagnosis
    Data packets are missing from the PSA table. BI processing does not return any errors. The data transport from the source system to BI was probably incorrect.
    We couldn't find any entries in the source system. We have checked the tablespaces and authorizations; they look good.
    We have replicated and tried again, but are still getting the same error message.
    We have SAP_BASIS release 700, level 0008, highest support package SAPKB70008.
    Please help us with the solution.
    Thanks in advance.
    -Soujanya

    Hi,
    1) Go to transaction BD87, take your IDoc number and check the status of the IDoc; if it is 64, just process it manually.
    2) Check the tRFC queue (transaction SM58) in the data warehouse system and execute the LUWs manually.
    Hope it helps.
    Regards,
    AK
    Assign points if useful.

  • How to list column names and data types for a given table using SQL

    I remember that it is possible to use a SELECT statement to list the column names and data types of database tables, but I forgot how it's done. Please help.

    You can select what you need from DBA_TAB_COLUMNS (or ALL_TAB_COLUMNS or USER_TAB_COLUMNS).
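    For example (a minimal sketch; EMP is just a placeholder table name - note that unquoted table names are stored in uppercase):
    [code]
    -- List the column names and data types of one table you own
    SELECT column_name, data_type, data_length
      FROM user_tab_columns
     WHERE table_name = 'EMP'
     ORDER BY column_id;
    [/code]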

  • What is the Data flow in SAP-BPC NW

    Hi All,
    Could anybody explain to me how the data flows in SAP BPC NW,
    what the steps for Planning are,
    and what the steps for Consolidation are?
    Regards,
    Setty

    Hi,
    1) Attributes describe master data such as a customer. The country where the customer resides, the industry, etc. describe and classify the customer. Attributes are not just the structure/metadata of the characteristics; they are also the content. The following table should visualize what I mean:
    customer-no. | name
    112 | mcpherson inc.
    113 | donalds inc.
    2) I think you are asking about the loads from an InfoPackage. We don't have "types" of InfoPackages, but we do have something called an InfoPackage group: you can create one with a selection of InfoPackages in the RSA1 Modelling area to automate those loads. It will then trigger only those InfoPackages which have been pooled into that group.
    Basically there are three types of loads in an InfoPackage:
    Full Upload: This will bring all the records into BW irrespective of how many times you load.
    Init Delta: Here you have 2 options: a) Init with data transfer and b) Init without data transfer. Init without data transfer will just set up your delta, and when you do the delta load it will bring all the records. Init with data transfer is similar to the full upload: it brings all the records the first time and also sets up the delta process. Once the Init is done, you have to do deltas for all the subsequent loads.
    Delta: This will bring only the changed / modified / added / deleted records from the source to BW.
    Hope it helps..
    Regards
    Lavanya

  • Problem with context mapping and data flow in a FPM application

    Hi All,
    I am trying to develop an ESS application using FPM. The requirement is to see the history of an employee in the second view.
    The first view has just the overview information and the second one has the detail, so the records and fields are the same on both views.
    As per the FPM guidelines, the model resides in the Fc component and the respective Vc components use the model data accordingly.
    I execute the model in the Fc component by calling the executable method in the interface controller of the first view, and then display the output data of the BAPI in the first view, which provides the overview information. This works fine.
    But when I map the same output node to the Table UI of the second view, the record size comes out as zero and thus no information is available.
    To work around this, I execute the RFC again in the interface controller of the second view to populate the records, which is incorrect, as it has already been executed and the data is available for the first view.
    Please let me know the correct approach to context mapping and data flow when using the FPM roadmap. Is there any standard method or approach available to deal with such requirements?
    Thanks in advance.
    Regards
    DK

    Hi Idhaya,
    The model node is available in the Fc, and the Fc interface controller is used in both the first Vc and the second Vc.
    The idea is: as the executable method is generated in the Fc, I created a custom method in the Fc that calls the executable method and receives the input parameter; this custom method is finally called in the first Vc.
    So now my first Vc is ready to call the custom method in the Fc and execute the RFC. Once the RFC is executed, the nodes in the Fc should get populated, which is the ideal case.
    And as the Fc is used as a component in the second Vc, the same node is available to its UI elements.
    But when I check the record size of the output node, it is always zero for the second Vc.
    Regards
    DK

  • Data area for accessing table is too small., error key: RFC_ERROR_SYSTEM_FAILURE

    Hi all,
    I built a Java application to call an SAP function module.
    This FM has only one import parameter, a structure whose last field is 16,000 characters long.
    When I start the application, everything works fine if the long field is empty, but if I fill it I get this runtime error:
    [code]
    Exception in thread "main" com.sap.aii.proxy.framework.core.BaseProxyException:
    Data area for accessing table is too small.,
    error key: RFC_ERROR_SYSTEM_FAILURE
         at com.sap.aii.proxy.framework.core.AbstractProxy.send$(AbstractProxy.java:150)
         at bi9032.BI_9032_PortType.zhr_Bi_9032(BI_9032_PortType.java:16)
         at bi9032.Startapp.main(Startapp.java:50)
    [/code]
    Can anyone explain the problem to me?
    Is it possible that I can't pass a large quantity of data?
    thanks and regards,
    enzo

    I understood that it's an SAP problem, so I will post in the ABAP forum.
    Bye,
    enzo

  • Data Models and Data Flow diagrams.

    Hi Gurus,
    Can anybody brief me on the concept of data models and data flow diagrams and their development, with illustrations? And whose responsibility is it - a technical or a functional consultant's - to translate business requirements and functional specifications into technical specifications, data flow diagrams and data models?
    Your valuable answers will be rewarded.
    Thanks in advance.

    Hi,
    Concept of Data Models
    A data model, or data modelling, is basically how you define or design your BW architecture based on business requirements. It deals with designing and creating an efficient BW architecture while sticking to standard practices.
    Multi-Dimensional Modeling with SAP NetWeaver BI
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/6ce7b0a4-0b01-0010-52ac-a6e813c35a84
    /people/githen.ronney3/blog/2008/02/13/modeling-strategies
    Modeling the Data Warehouse Layer with BI
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3668618d-0c01-0010-1ab5-aa75c3a4dfc2
    /people/gilad.weinbach2/blog/2007/02/23/a-beginners-guide-to-your-first-bi-model-in-nw2004s
    Data Flow Diagrams
    These show the path of the data flow for each individual object in BW: how data gets loaded into that object and how it goes out of the object, etc.
    Right-click on the data target > Show data flow.
    This shows all the intermediate layers through which data comes into that particular object.
    Responsibility of a Technical or a Functional consultant
    This is generally done in the design phase itself, by a senior technical consultant with the help of a functional consultant, or by a techno-functional consultant interacting with the business.
    Hope this helps.
    Thanks,
    JituK

  • Data Flow in SAP BI - Please Help

    Hello All,
    I have to prepare a data flow in SAP BI for an insurance system application.
    There are 4 flat files coming from the source. Each flat file has records on insurance details such as policies and so on.
    Each flat file represents a line of business. The data model is newly being made. There will also be historic data for the last three years for reporting, and current-year data is loaded on a monthly basis.
    Also, the error records will go to a different target for reporting for each line of business.
    Please suggest the data flow design that would work best.

    Hi Sridhar,
    1) Sender Adapter & Sender agreement & communication channel
    2) receiver determination
    3) interface determination
    4) message branch
    5) receiver agreement & communication channel
    6) call inbound adapter
    Regards Mario

  • Data flow task error failed validation and return validation status "VS_NEEDSNEWMETADATA"

    I have an ETL with ~800 tables that I am moving from Oracle to SQL Server (Prod Oracle -> Prod SQL).
    Now a new Oracle/SQL version came from the vendor that I need to test, so I created new DEV environments for Oracle and SQL; the update includes new columns in existing tables as well as new tables (DEV Oracle -> DEV SQL).
    So what I tried to do is take the old ETL (PROD) and change the connections to the DEV servers.
    When I execute the packages from my local laptop it works, but when I execute them from the job schedule I get the error: "Data flow task error failed validation and return validation status "VS_NEEDSNEWMETADATA"".
    I went to each table to check whether the columns differ, and I dropped some of the tables and recreated them in the destination, but the error still shows. I also tried setting "DelayValidation" to True on the package, but without success.

    I do not understand the difference between "... if I going to change the Connection Manager to new connection" and "didn't change the Connection Manager, only changed inside the Server name / user / pass" for 800 tables.
    What I see is that some of the tables your packages see in Dev (laptop) do not have the same schema once the package is deployed, hence the metadata error.
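    One way to spot the mismatch is to compare the column metadata of a suspect table on both servers and diff the output, e.g. (a sketch; 'YourTable' is a placeholder):
    [code]
    -- Run on both the DEV and PROD SQL Server databases; any difference in
    -- columns or data types is what makes the package demand new metadata.
    SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
      FROM INFORMATION_SCHEMA.COLUMNS
     WHERE TABLE_NAME = 'YourTable'
     ORDER BY ORDINAL_POSITION;
    [/code]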
    Arthur

  • Temporary data carriers for Database tables

    Hi Gurus,
    How can we find the temporary data carriers for database tables? For example, when we create a BOM in CS01/CS02, the data first goes to the RC29P structure, and when the system finishes, it goes to STPO.
    This is one example. How can we find other structures and their corresponding database tables?
    Any ideas?
    Regards

    Hi Greg,
    Those are the aggregated tables. If they are not filled with data although the Portal Activity Report is activated, you should check whether the aggregation finished successfully.
    In the older SPs there were some problems that were fixed in later SPs of 7.0.
    In order to have the latest version of Portal Activity report, you can check SAP note 1084379 - Portal Activity Report - Latest Version (SDA file).
    You can compare the SP via the MANIFEST file, as it contains the version and SP number.
    In order to troubleshoot problems in Activity Report, you can follow SAP note: 1690023 - Portal Activity Report - Component-specific Note
    Some basic checks that you can do:
    Run a query on the raw data tables to check since when there has been data in those tables:
    select min(timestamphour) from SAP<SystemID>DB.WCR_WEBCNODESTAT;
    If there is too much data from long ago, you should delete the old data and leave only the new data (there is in any case a retention time for which the data is kept).
    If there is no data, it means that the Portal Activity Report does not collect data and is not really activated (usually this is not the case).
    The aggregation runs every top of the hour, so you can check in the default traces for an error during that time.
    In most of the cases something went wrong while aggregating the data.
    As a result the aggregation is not finished, so the transaction is not being committed, and the aggregated tables stay empty.
    If there is a DuplicateKeyException in the trace, you can follow SAP note 1054145 - Duplicate Key Exception.
    If you have any more questions, please don't hesitate to ask.
    I hope this information helps,
    Thanks & regards,
    Michal Zilcha-Lang

  • Gathering Technical and Business Requirements for SAP PI

    I need some documents and tools, or an interview guide & questionnaire, to gather the stakeholders' technical and business requirements for SAP PI.

    Thanks for the links,
    I am quite new to SAP and am currently trying to evaluate whether I should use SAP PI or not, based on my project stakeholders' requirements.
    My biggest problem right now is how to capture the technology and business requirements and match them with SAP PI.
    I plan to use this format while gathering requirements about the integration platform:
    1- Draft requirements
    Q- What are the names and statuses of the sending and receiving applications, and how will they be used?
    2- Detailed requirements
    Q- Connectivity, administrative and business-related questions
    3- Administrative and project management information
    Q- Which applications are involved? Who is the original supplier? Who is the technical contact at the supplier?
    4- Business requirements
    Q- Questions about the information flow which will support the business processes
    5- Connectivity to applications
    Q- Questions about the connectivity mechanism, file formats (flat file, XML or something else), data access methods and so on
    What do you think about the general concept?
