F4 help and data flow

Hi,
             I created an online sales order interactive form using ABAP. In the Web Dynpro component I maintained the display type as Native, and in the form the layout type as ZCI layout. In the form layout I dragged and dropped the Value Help button and the Submit button from the Web Dynpro Native tab. Now the buttons are working, but the data flow is not happening, so I am not able to create a sales order.
            But if I use the display type ActiveX, the form layout Standard, and the buttons from the Web Dynpro ActiveX tab, then F4 help is not working but the data flow is happening, so I am able to create a sales order. I need F4 help and I should also be able to create the sales order.
So please help me.
Thanks & Regards,
Krishna,

Hi Mohan,
For the ZCI layout, have you inserted the Web Dynpro script in the designer? If not, go to the Layout Designer and, in the Web Dynpro toolbar, choose Utilities -> Insert WebDynpro Script.
To check whether the script has been inserted, go to Palette -> Hierarchy in the Adobe Form toolbar and scroll down to Variables; there you should find a script object called "containerFoundation_JS". If it is present, it will work.
If it is not inserted, run the report FP_ZCI_UPDATE.
Regards
Pradeep Goli

Similar Messages

  • Data Models and Data Flow diagrams.

    Hi  Gurus,
        Can anybody brief me on the concept of data models and data flow diagrams and their development, with illustrations? And is it the responsibility of a technical or a functional consultant, i.e. to translate business requirements and functional specifications into technical specifications, data flow diagrams and data models?
    Your valuable answers will be rewarded.
    Thanks in advance.

    Hi,
    Concept of Data Models
    A data model, or data modelling, is basically how you define or design your BW architecture based on business requirements. It deals with designing and creating an efficient BW architecture while sticking to standard practices.
    Multi-Dimensional Modeling with SAP NetWeaver BI
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/6ce7b0a4-0b01-0010-52ac-a6e813c35a84
    /people/githen.ronney3/blog/2008/02/13/modeling-strategies
    Modeling the Data Warehouse Layer with BI
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3668618d-0c01-0010-1ab5-aa75c3a4dfc2
    /people/gilad.weinbach2/blog/2007/02/23/a-beginners-guide-to-your-first-bi-model-in-nw2004s
    Data Flow Diagrams
    This shows the path of the data flow for each individual object in BW: how data gets loaded into that object, how it goes out of the object, and so on.
    Right-click on the data target > Show Data Flow.
    It shows all the intermediate layers through which data comes into that particular object.
    Responsibility of a Technical or a Functional consultant
    This is generally done in the design phase itself by a senior technical consultant with the help of a functional consultant, or by a techno-functional consultant interacting with the business.
    Hope this helps.
    Thanks,
    JituK

  • Problem with context mapping and data flow in a FPM application

    Hi All,
    I am trying to develop an ESS application using FPM. For the same, the requirement is to see the history of an employee in the second view.
    The first view has got just the overview information and the second one has got the detail. So, the records or the fields are the same on both the views.
    As per the FPM guidelines, the Model is residing in the Fc component and the respective Vc components are using the model data accordingly.
    I am executing the model in the Fc component, calling the executable method in the interface controller from the first view, and then trying to display the output data of the BAPI in the first view, which provides the overview information. This is working fine.
    But when I try to map the same output node to the table UI for the second view, the record size comes back as zero and thus no information is available.
    To work around this, I am executing the RFC again in the interface controller of the second view to populate the records, which is incorrect, as it has already been executed and the data is available for the first view.
    I request you to let me know the correct approach to context mapping and data flow when using the FPM roadmap. Is there any standard method or approach available to deal with such requirements? Please let me know.
    Thanks in advance.
    Regards
    DK

    Hi Idhaya,
    The model node is available in the Fc, and the Fc interface controller is being used in the first Vc and the second Vc.
    So the idea is: as the executable method is generated in the Fc, I have created a custom method to call the executable method in the Fc, where the input parameter is passed, and this custom method is finally called in the first Vc.
    So now my first Vc is ready to call the custom method in the Fc and execute the RFC. Once the RFC is executed, the nodes in the Fc should get populated, which is the ideal case.
    And as the Fc is used as a component in the second Vc, the same node is available to the UI elements.
    But when I check the record size for the output node, it is always zero for the second Vc.
    Regards
    DK

  • SSIS Control and Data Flow Items not available in Customize Toolbox

    I cannot add additional Data Flow and Control Flow items to the SSIS Toolbox. I am using SQL Server 2014 and VS 2013 (BIDS). The standard SSIS items appear, but that is all of them. How do I add the additional ones that are not loaded by default? I didn't have this issue with SQL Server 2012.
    Thank you,
    John

    John, which ones do you not see? AFAIK nothing has changed regarding how the items get added to the SSIS toolbox in SSDT or in VS 2013 with SSDT installed (there is no BIDS in SSIS 2012 and up).
    Arthur My Blog

  • Problem  with search help and date fields

    Dear experts,
    I have two text fields, and to each I assigned the CACS_CALENDAR search help.
    It works well normally, but if I make the text box output-only then I cannot select a date.
    I want the text box, in its disabled form, to still be usable for selecting a date from the CACS_CALENDAR search help that I assigned. The user should not provide manual input, which means the fields should otherwise be greyed out.

    Hi Aditya
    If an input/output field is given the attribute output-only, then even though a search help is provided, the values in the search help list will also be in read-only mode and you cannot select them at all. Maybe you can solve your problem through a different approach.
    When a manual entry is made with a wrong value that is not present in the F4 help/search help list and execution is done, SAP will by default throw an error saying the value is invalid.

  • Web Services and data flow on Multicast

    Need to find out a few options for how a Web Service application could send data (XML) to a multicast configuration (PIM).
    I first thought about SOAP over some protocol like HTTP but wanted to get more input and ideas based on other experiences.
    Also, what happens at the receiving end? Do I need a listening socket or something?
    Any ideas or real-life experiences (even if not in a Multicast environment) are greatly appreciated!
    Regards,
    Arie.
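
    For the receiving-end part of the question: with IP multicast, a receiver typically joins the group on a listening socket and reads datagrams from it. Below is a minimal Java sketch for illustration only; the group address, port, and payload handling are placeholders, not details from this thread.

        import java.net.DatagramPacket;
        import java.net.InetAddress;
        import java.net.InetSocketAddress;
        import java.net.MulticastSocket;
        import java.net.NetworkInterface;
        import java.nio.charset.StandardCharsets;

        // Joins a multicast group and blocks until one datagram arrives.
        public class MulticastReceiverSketch {
            public static void main(String[] args) throws Exception {
                InetAddress group = InetAddress.getByName("239.1.2.3"); // placeholder group address
                int port = 5000;                                        // placeholder port

                try (MulticastSocket socket = new MulticastSocket(port)) {
                    // Pick the interface for the local host; a real receiver may need to choose explicitly.
                    NetworkInterface nic =
                            NetworkInterface.getByInetAddress(InetAddress.getLocalHost());
                    socket.joinGroup(new InetSocketAddress(group, port), nic);

                    byte[] buffer = new byte[8192];
                    DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                    socket.receive(packet); // blocks until a datagram (e.g. the XML payload) arrives

                    String xml = new String(packet.getData(), 0, packet.getLength(), StandardCharsets.UTF_8);
                    System.out.println("Received: " + xml);
                }
            }
        }

    The sender side would just write the XML bytes to the same group and port with a DatagramSocket; whether SOAP over HTTP or raw UDP multicast is the right transport depends on whether the application can tolerate unreliable delivery.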

    Are you binding the data table component to the SearchResult[] array? Did you create a property for it in your page bean (.java file)?
    Are you binding the different columns' components with the various getters from SearchResult[]?
    For example, the binding should be of the form
    #{currentRow.bookText} or #{currentRow.verseText}
    etc.
    --Gail
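
    To illustrate the kind of property Gail is describing, here is a minimal page-bean sketch in Java; the class, property, and field names (SearchPage, SearchResult, bookText, verseText) are assumptions based only on the bindings mentioned above, not code from the original application.

        // A data table binds to the SearchResult[] property, and each column
        // binds to a getter such as #{currentRow.bookText}.
        public class SearchPage {

            // Value object whose getters back the column bindings.
            public static class SearchResult {
                private final String bookText;
                private final String verseText;

                public SearchResult(String bookText, String verseText) {
                    this.bookText = bookText;
                    this.verseText = verseText;
                }

                public String getBookText()  { return bookText; }   // #{currentRow.bookText}
                public String getVerseText() { return verseText; }  // #{currentRow.verseText}
            }

            private SearchResult[] results = new SearchResult[0];

            // The property the data table component is bound to.
            public SearchResult[] getResults() { return results; }

            public void setResults(SearchResult[] results) { this.results = results; }
        }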

  • Datasource and Data flow for SAP table AFVC and field AUFPL

    Hi
           During mapping of SAP fields to datasource fields, we have come across the table AFVC and the field AUFPL. I am not able to find a datasource for the above-mentioned table or field. Can anyone please help me out?
    Thanks
    Datta

    Hi DLK,
    You can search using SE11.
    Go to SE11 in the ECC system, enter the table name and click on the where-used list.
    On the next screen, select Structure and press Enter.
    On the result screen you will get the list.
    Regards,
    Venkatesh

  • 4503/4506 - data handling and data flow

    I have been tasked to find a document that details how, from start to finish, a packet goes through a 4503/4506 unit. I'm talking about a level of detail that includes what portion of RAM the packet goes into once it hits the inbound interface, what parts of the switch handle the analysis (ACLs, and so on), right up until the packet is either dropped or forwarded to the outgoing interface. As detailed a description as possible; if a non-model-specific equivalent is available and applicable to this unit, that works as well.
    I have been looking through the TechDocs and the like, as well as making several attempts at Google (which is well-nigh useless), and no luck thus far.
    Thanks in advance for any information provided.

    I am not aware of any CCO documents explaining path of a packet/CAT4500 architecture. However, there was a presentation on this at Networkers 2005. If you attended it, you can check it out at
    http://www.cisco.com/networkers/nw05/nwol.html
    Here is the session information for RST-4500.
    Session Title: Cisco Catalyst 4500 Switch Architecture
    Length: 2 Hours
    Level: Intermediate
    Related Sessions:
    RST-3031 Troubleshooting LAN Protocols
    RST-3042 Troubleshooting Cisco Catalyst 4000 and 4500 Series Switches
    Abstract: This session presents an in-depth study of the architecture of the Cisco Catalyst 4500 Switches and how the various components work together. The focus for this session is present information to help the audience understand the architecture to be able to design and implement Catalyst 4500 in their network and troubleshoot better.
    Topics include a discussion of the architecture, information about the latest Cisco Catalyst 4500 supervisors and switching modules such as the Supervisor V-10GE, NetFlow Feature Card (NFL), Catalyst 4948, Supervisor II+TS and PoE linecards, as well as key features such as CEF/multicast forwarding, DHCP Snooping, IP Source Guard, Dynamic ARP Inspection, 802.1X, redundancy (SSO), NetFlow, ACL/TCAM, and QoS (per-port/per-VLAN and UBRL).
    This session is designed for network designers and senior network operations engineers who are considering deploying, or already have, Cisco Catalyst 4500 series switches in enterprise and service provider networks.
    * Prerequisites:
    1. Knowledge of LAN protocols is required
    2. Basic understanding of Cisco Catalyst switches is required.
    Speakers: Balaji Sivasubramanian
    Escalation Eng
    Cisco Systems
    Balaji Sivasubramanian is an escalation engineer in Cisco's Gigabit Switching Business Unit. Balaji, who is a CCNP, is also co-author of "CCNP Self-Study: Building Cisco Multilayered Switched Network - 2nd Edition" (ISBN 1587051508). Balaji is an expert in Catalyst 4500 switch architecture and in troubleshooting LAN protocols and Catalyst switches, including the Catalyst 4500, Catalyst 6500 and Catalyst 3500. In his 5+ years with Cisco, Balaji has also held the positions of TAC Technical Leader/Expert in LAN/campus switching, Worldwide Subject Matter Expert in LAN technologies for the Cisco TAC, and TAC support engineer in LAN/campus switching.

  • Ssis data flow item and ssis control flow item tab missing in choose toolbox item from ssdt 2010

    ssis data flow item and ssis control flow item  tab missing in choose toolbox item from ssdt 2010

    I have the same problem.
    When I click on Tools -> Choose Toolbox Items, the Control Flow and Data Flow tabs are missing from the dialog box.
    I have just worked with SQL Server Data Tools and SQL Server 2012, and there these tabs are not missing.
    I think this is a problem with the SQL Server installation.
    I have not yet found a solution.

  • SSIS Data Flow Question

    While working on my current project, a problem came to mind.
    On past projects in SSIS, I was taking data from an Excel or csv source and converting for use in SSMS. Straightforward and fairly simple, as I became more and more acquainted with SSIS.
    But, the stored procedure I'm working on now raised a question on how data flows through a project.
    The procedure updates a couple of tables using SQL's UPDATE command. So, it does a direct modification of a table.
    However, in duplicating that in SSIS - using Derived Column tools - I started to wonder what the data was doing as the flow progressed through the various steps in the Control and Data flows.
    Here's my Control Flow so far;
    In Step 1, the SQL table, sh1, has all records that meet the WHERE criteria of the embedded SQL code deleted, but leaving the rest of the table intact.
    In Step 2, I define a variable for use in the project using an embedded SQL query.
    In Step 3, I encounter the first update task;
    This flow terminates in the lower right corner with Update sh1, Pass 4. Here is where I have my question.
    When this flow terminates, does it return to the Control Flow with the table sh1 updated as required by the Update sh1 Data Flow? In other words, does it enter Control Flow Step 4 with the updates, or does the next update task (one in which the stored procedure uses the modified table, sh1) simply reload sh1 from the server, without any updates?
    If the former, then no worries, as that's how I was proceeding until a little bug in my ear said, "Whoa, buckwheat! Maybe the changes aren't stored."
    If the latter, then I'm guessing that I need to have a Data Flow Destination tool dragged to the workspace and connected up to the output of the Update sh1, Pass 4 tool.
    Fine, I thought, I'll direct the output to the sh1 table.
    But that will put all of sh1 into the table, not just the updated rows. I'd have to TRUNCATE sh1 first. But sh1 is active in the Data Flow; if I truncate it, there'll be nothing to update. Or, if I truncate it after the Update sh1, Pass 4 tool, it'll lose all the updates - and everything else.
    I figured that, in the latter case, I'd have to have some sort of interim table to hold the results of the Update sh1 Data Flow, truncate sh1, and then put the contents of the interim table into the now-empty sh1 before returning control back to the Control Flow.
    I'd like to avoid an interim table, if at all possible, but if I have to have one, I have to.
    Or, is there another way?
    Thanx in advance for any help!

    You can always use a Recordset Destination to hold the results from your first data flow transformation. The record set variable is retained in memory and available within the context of your package:
    Object Variables Recordsets
    But then you need to modify your package to read from the record set. You can either use a script task, or loop through each record in your record set and process the updates for each row:
    Shredding Record Set
    While these are possible options, I do not think they are the best way to go about things.
    I would like to know what you are doing in the Update Sh2 step. You can keep adding data flows that follow Update sh1, Pass 4 and then finally update the table through an OLE DB Command transformation; then you do not have to store anything in memory, but you might end up with a complex data flow.
    The design will depend on the volume of records and the number of transformations you want the data to go through.
    You can avoid transformations by writing queries that retrieve results in the way you want, for example sorting the results by adding an ORDER BY to your source query rather than adding a Sort transformation.
    Regards, Dinesh

  • Read from sql task and send to data flow task - [OLE DB Source [1]] Error: A rowset based on the SQL command was not returned by the OLE DB provider.

    I have created an Execute SQL Task.
    In it, I have created an 'empidvar' variable of string type, set the SQL statement to 'select distinct empid from emp',
    and set the result set mapping to result name = 0 and variable name = empidvar.
    I have added a data flow task with an OLE DB source and put this SQL statement under SQL command: exec emp_sp @empidvar=?
    I am getting an error:
    [OLE DB Source [1]] Error: A rowset based on the SQL command was not returned by the OLE DB provider.
    [SSIS.Pipeline] Error: component "OLE DB Source" (1) failed the pre-execute phase and returned error code 0xC02092B4.

    Shouldn't the setting be ResultSet = Full result set, as your query returns a result set? Also, I think the variable to be mapped should be of Object type.
    Then the data flow task needs to be placed inside a Foreach Loop based on the ADO.NET recordset, with your earlier variable mapped inside it, so that it iterates over every value the SQL task returns.
    Also, if you are using a stored procedure in the OLE DB source, make sure you read this:
    http://consultingblogs.emc.com/jamiethomson/archive/2006/12/20/SSIS_3A00_-Using-stored-procedures-inside-an-OLE-DB-Source-component.aspx
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • How can I fix consistent TCP timeout and make data flow simpler?

    Hi!
    I'm acquiring data from a Scanivalve Digital Scanning Array through a TCP/IP connection. I'm having problems with the connection timing out regularly. It will run fine if you take data several times in a row, but if the VI sits for several minutes (while looking at previous data runs or adjusting the test setup) the connection will time out the next time you try to take data. The subVIs (reading the TCP connection, processing the data packet, etc.) were provided by the manufacturer several years ago. The mid-level VIs were written by another engineer, and I adapted them to run with the top-level VI that I needed for this test. The data flow in the VIs is convoluted, and while I can follow the ones that I wrote, I'm not sure how to troubleshoot the others.
    I posted a question on an unrelated problem, and the responses mentioned that race conditions were going to be a problem because of all of the global variables.
    Please be patient with my lack of knowledge in some areas. Any help would be appreciated.
    Thanks so much!
    -Sarah
    Engineering intern
    Techsburg, Inc.
    Attachments:
    DSA_Acquistion_VIs.zip ‏253 KB

    I'm not positive about the best solution in this situation, but there is lots of information available regarding error 56 when using TCP/IP communication.
    You might find some of these useful:
    Error 56 Occurred at TCP Open: Windows XP Fails as TCP/IP Server with LabVIEW 6.1.
    Error 56 Occurs When Using TCP Listen.vi
    TCP/IP Error Codes and Related Time-out Issues in LabVIEW

  • Moving images/catalogue from daughter to main catalog - Question and help requested on flow

    All,
    I have been using LR since 1.0 and now on 2.5 on a PC (Vista & Windows 7)
    My picture management work-flow
    I pull photos in onto a laptop, which has a small catalog of 2009 images, usually only about 6000 (NEFs from a D300/D70) (the daughter catalog).
    On laptop I keyword etc and prepare photos.
    Once my hard drive (250GB) gets about 2/3 full I move 10-20 folders (I store images by shooting date) to a "master" hard drive (1TB) using the freeware "Syncback".
    I back up catalog at this point.
    Also, moving the images to the 1TB drive means those photos should now go into my main catalog, so I export the catalog of the images that I transferred above and import it into the main catalog; that sort of works.
    Now question(s) come(s)
    In the 2009 catalog I lose the link to the images once they have been moved by SyncBack to the 1TB drive, and I have to manually reconnect them within Lightroom.
    I am sure there is a better way to do this. Is there, and can anyone help me?
    Can I drag and drop, in Library mode in the folder panel on the left, the images from the local (C:) drive to the 1TB drive? I haven't tried this (I'm a coward).
    If I eschew the use of SyncBack and instead export the catalog and images from the daughter catalog, how can I make sure, when I import them into the main catalog, that the images are stored on the 1TB drive correctly (Images/yyyy-mm-dd)?
    Is this handled through the image import dialog when moving the exported catalog into the Main one ?
    I hope this is understandable, as I say I have been a little timid with these steps
    Many many thanks in advance for help and pointers
    Simon Leppard

    When you import a catalog, the Import Dialog will come up. Click copy to new location and Import. Lightroom will import using the existing folder structure as is.

  • Help me in my data flow ... new to Bi 7.0

    Dear friends,
    I am new to the data flow in BI 7.0. I am loading data into an InfoCube from a flat file which has 7 records, and I loaded the data into the InfoCube with 7 records. Then I added 2 records to my flat file, and when I load it I get 9 records in my PSA but 16 records in the InfoCube. I do not know how to handle this. What settings should I make in the DataSource maintenance? Please help me understand the data flow for BI 7.0; I am studying the help files as well.
    Regards,
    pavan

    Hello Pavan,
    1. The InfoPackages are the same.
    2. The processing type for the DTP must be DELTA.
    --> The source for the DTP is in this case the DataSource (PSA).
    --> So you need 3 processes: a delta InfoPackage from the flat file to the PSA, a delta DTP from the PSA to the DSO, and a delta DTP from the DSO to the InfoCube.
    3. "Only get delta once" means the source request is extracted via delta DTP from the source to the target only once. If you delete the target request from the InfoCube or DSO, the related source request will NOT be transferred to the target again with the next delta upload. Usually it would be.
    4. "Get data by request" means the delta DTP uses the same number of requests as in the source. Usually the DTP collects the data from the source and creates a new request that can include several source requests. With this flag the DTP request uses the source requests to transfer the data into the target, so the number of requests is the same in the source and in the target.
    I hope this helps,
    Michael

  • Help Required Regarding - SAP Job names using R3 data flows

    We are calling a set of SAP jobs using R3 data flows in Data Services. Whenever a job fails, we first kill the active SAP jobs by logging into SAP and then restart the jobs.
    There are about 100-odd SAP jobs that we call using these Data Services jobs, so we wanted to kill the jobs using reusable code on the SAP side, passing the job name just before every R3 data flow in case it is still in active status.
    So I wanted to know if there are any shortcuts to retrieve the set of associated SAP job names, because it would be a tedious process to hardcode the SAP job names and pass them as parameters for all 100+ SAP job names in the custom-defined reusable code.
    Any help or advice on this please !!

    "The program is not meeting the expectations and the problem is due to reflection."
    Do we know this for certain?
    "... my application gets the class name, field name etc. from an XML file, so I don't know their method names beforehand. Now, since every class instance corresponds to a row in the database and I have to call the get and set methods of each class instance, the performance keeps degrading as the number of columns and rows increases. Can somebody suggest some improvement regarding this, and regarding creating multiple instances of the same object?"
    Class.forName() will be using a hash already, so there is probably not much room for improvement.
    Class.newInstance() probably does not take significantly more processing than a simple "new Fubar();".
    Umpteen reflective method invocations (one per column) for each row/instance - are you saying these are the problem?
    You can test this easily enough: if you comment out the reflective method invocations and leave the rest of your code untouched, does your application processing speed up significantly?
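
    To make that test concrete, here is a rough, self-contained Java timing sketch comparing direct setter calls with reflective ones; the Row class and the iteration count are made up for illustration and are not from the original application.

        import java.lang.reflect.Method;

        // Rough timing of direct vs. reflective setter calls on the same object.
        public class ReflectionCostSketch {

            public static class Row {
                private String value;
                public void setValue(String value) { this.value = value; }
                public String getValue() { return value; }
            }

            public static void main(String[] args) throws Exception {
                final int rows = 100_000;
                Row row = new Row();
                Method setter = Row.class.getMethod("setValue", String.class);

                long t0 = System.nanoTime();
                for (int i = 0; i < rows; i++) {
                    row.setValue("x");                 // direct call
                }
                long direct = System.nanoTime() - t0;

                long t1 = System.nanoTime();
                for (int i = 0; i < rows; i++) {
                    setter.invoke(row, "x");           // reflective call
                }
                long reflective = System.nanoTime() - t1;

                System.out.printf("direct: %d ms, reflective: %d ms%n",
                        direct / 1_000_000, reflective / 1_000_000);
            }
        }

    Note that the JIT narrows the gap after warm-up, so any real measurement should run both loops a few times before trusting the numbers.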
