Webforms - data handling and storage

Hello,
I would like to know the recommended best practice / methodology for the following scenario:
A custom web form with options to enter data, add attachments and submit. On submission, a workflow is triggered, followed by approvals and finally a data update into the relevant transaction.
My query:
-> For saving the form data and attachments, the options I am aware of are:
1) Tables
2) Case management, into the transaction using GOS
3) DMS
4) Workflow container
5) Business Document Services
Any more to add to this list would be really good.
In such a scenario, what is the best possible way to go?
Would be good to hear your thoughts on this.
Thanks a lot
Saujanya

Hi,
In one of my projects we had a similar kind of scenario; below is the approach we tried, and it worked.
1. Store all the relevant data fields, header- and item-level, in tables, and also maintain the relationship between these two tables.
2. When it comes to attachments, we first created a unique GUID for each attachment and stored the relevant document in the DMS system, so whenever we needed the document we referred to the GUID that was generated.
3. As usual, when the user clicks on Submit, we created a custom business class and raised the event through SAP_WAPI_CREATE_EVENT.
4. If an attachment is required in the workflow, we referred to it in the tables.
Maybe it will help you to start... Have a great day
Regards
Pavan
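Pavan's steps above (header/item tables linked by key, plus a GUID per attachment that the workflow later uses to fetch the document) can be sketched roughly as follows. This is only an illustrative Python sketch with in-memory dictionaries standing in for the custom tables and the DMS store; all names here are made up, and in a real system the final step would raise the workflow event (e.g. via SAP_WAPI_CREATE_EVENT) instead of the comment shown.

```python
import uuid

# Hypothetical in-memory stand-ins for the custom header/item tables and the
# DMS document store -- illustrative only, not real SAP objects.
form_headers = {}      # form_id -> header fields
form_items = {}        # form_id -> item rows, linked to the header via form_id
attachment_index = {}  # guid -> document content (stand-in for DMS)

def save_form(form_id, header, items, attachments):
    """Persist header/item data and register each attachment under a GUID."""
    form_headers[form_id] = header
    form_items[form_id] = list(items)
    guids = []
    for doc in attachments:
        guid = uuid.uuid4().hex  # unique key used later to retrieve the document
        attachment_index[guid] = doc
        guids.append(guid)
    # at this point the workflow event would be raised for the approval chain
    return guids

def fetch_attachment(guid):
    """Retrieve a stored document by its GUID, as in step 2 above."""
    return attachment_index[guid]
```

The point of the GUID indirection is that the workflow containers only need to carry small keys, while the documents themselves stay in the document store.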

Similar Messages

  • Date Wise and Storage Location Wise Stock Qty & Value Report......

    Hi Experts,
    Hi Experts,
    We want a report, date-wise (as on 31.03.2008) and storage-location-wise, of quantity & value for finished materials only. Is there any such report?
    From MB5B we cannot get a storage-location-wise report, as we get only plant-level quantity and value.
    Please guide us.
    Regards,
    Yusuf

    Hi Yusuf,
    Try the Tcode MC.9: enter the site and article and execute. You will get details of the article's
    stock and value. If you double-click the article you will get the details per storage location.
    Hope it will be of help to you
    Regards
    GK

  • Azure Site Recovery to Azure - cost for data transfer and storage

    Hello,
    I send you this message on behalf of a small firm in Greece interested to implement Azure Site Recovery to Azure.
    We have one VM (Windows 2008 R2 Small Business Server) with 2 VHDs (a 100 GB VHD for the OS and a 550 GB VHD for data) on a Windows 2012 Server Standard Edition host.
    I would like to ask you a few questions about the cost of the data transfer and the storage.
    First: About the initial replication of the VHDs to Azure. It will be 650 GB. Is it free as inbound traffic? If not, the Azure pricing calculator shows about 57€. But there is also the import/export option, which costs about the same:
    https://azure.microsoft.com/en-us/pricing/details/storage-import-export/
    What would be the best solution for our case? Please advise.
    Second: What kind of storage is required for the VHDs of the VM (650 GB)? My guess is Blob storage. For this storage, locally redundant, the cost will be about 12-13€/month. Please verify.
    Third: Is the bandwidth for the replication of our VM to Azure free?
    That's all for now.
    Thank you in advance.
    Kind regards
    Harry Arsenidis 

    Hi Harry,
    Hi Harry,
    1st question response: ASR doesn't support Storage Import/Export for seeding the initial replication storage. ASR pricing can be found
    here; it details that about 100 GB of Azure replication & storage per VM is included when the ASR to Azure subscription SKU is purchased through a Microsoft Enterprise Agreement.
    Data transfer pricing
    here indicates that inbound data transfers are free.
    As of now the only option will be online replication. What is your current network link type & bandwidth to Azure? Can you vote for the feature & update the requirements here?
    2nd question response: A storage account with geo-redundancy is required. But as mentioned earlier, with a Microsoft Enterprise Agreement you will get 100 GB of Azure replication & storage per VM included with ASR.
    3rd question response: Covered as part of the earlier responses.
    Regards, Anoob

  • Data class and storage class

    Hi ,
    I need some info or docs on the data class and storage class of a DSO/cube.
    Regards
    tapashi

    Hi Tapashi,
    Something I found about data class in DSO/cube which might be useful to you.
    Within SAP BW following data class of DDIC objects are important:
    DDIM           Dimension Tables in BW of InfoCubes
    DFACT          Facts Table in BW of InfoCubes
    DODS           ODS Tables in BW
    These have been introduced in order to improve performance while reading/writing InfoProviders. Settings of the data class are maintained in the "Technical Settings -> Database storage parameters" screen of transaction SE11. The data class is assigned to the database tables of the InfoCube (tables RSDCUBE, RSDODSO). Notice that this assignment cannot be made under any circumstances by the user; only the system does this when you activate the InfoProvider.
    Subsequently, see this overview of RSDCUBE's fields with a link to the data class, according to BW version:
    SAP BW 3.x (parameters only affect aggregates, not the cube):
    AGGRDATCLS     Data class for aggregate fact tables (only aggregates)
    AGGRSIZCAT     Size category for aggregate fact tables
    ADIMDATCLS     Data class for aggregate dimension tables
    ADIMSIZCAT     Size category for aggregate dimension tables
    Furthermore, see this overview of RSDODSO's fields for DSO objects as InfoProviders with a link to the data class:
    ODSADATCLS     Data class for table with active data of the ODS
    ODSMDATCLS     Data class for table with ODS input data
    To see all available data classes check table: DDART (DD: Data Class in Technical Settings)
    To see all available size categories check table: DGKAT (DD: Size category in technical settings)
    Hope this is helpful.
    Regards
    Snehith

  • Data handling and graphs

    I originally wrote the graph program to handle the data in a text file that was organized like this...
    vertex a
    vertex b
    vertex c
    vertex d
    edge a c
    edge a d
    edge d b
    and now I have to change the main to accept a datafile containing...
    a b
    b c
    c e
    d g
    g c
    Now, here is a copy of the main program as it currently stands...
     import java.io.*;
     import java.util.*;

     public class TopSort {
          static Graph theGraph = new Graph();
          static Hashtable hashList = new Hashtable(); // just to store the array index

          public static void main(String args[]) {
               MyInfoachaffin myInfo = new MyInfoachaffin();
               myInfo.info();
               File sourceFile = new File(args[0]); // access the file
               if (sourceFile.exists() == false) {
                    System.err.println("Oops: " + sourceFile + ": No such file");
                    System.exit(1);
               }
               String newVertex, startEdge, endEdge;
               int arrayPosition = -1;
               try { // open the file
                    FileReader fr = new FileReader(sourceFile);
                    BufferedReader br = new BufferedReader(fr);
                    String input;
                    while ((input = br.readLine()) != null) {
                         StringTokenizer toke = new StringTokenizer(input);
                         while (toke.hasMoreTokens()) {
                              if (hashList.containsValue(toke)) {
                                   return;
                              } else {
                                   newVertex = toke.nextToken(); // get vertex
                                   theGraph.addVertex(newVertex); // add into graph
                                   arrayPosition++; // increment counter
                                   hashList.put(newVertex, new Integer(arrayPosition));
                                   // add position with vertex as key
                              }
                              /*else if (toke1.equals("edge")) {
                                   startEdge = toke.nextToken(); // get start vertex
                                   endEdge = toke.nextToken(); // get end vertex
                                   Integer temp = ((Integer) hashList.get(startEdge));
                                   int start = temp.intValue(); // find position with key
                                   Integer temp2 = ((Integer) hashList.get(endEdge));
                                   int end = temp2.intValue(); // find position with key
                                   theGraph.addEdge(start, end); // add edge
                              }*/
                         } // close inner while
                    } // close outer while
                    System.out.println("The hashtable contents: " + hashList.entrySet());
                    br.close();
               } // close try
               catch (IOException e) {
                    System.out.println("Whoops, there's a mistake somewhere.");
               }
               theGraph.graphSort();
          } // end main
     } // end class
     I have managed to separate the vertices from the edges and stored them in a hashtable so as to be able to remember their location in the array itself. My question is: is there a way to go through the file a second time and pull the vertices from it, or, conversely, should I store the data from the file in, oh I dunno, a linked list or something that I can pull the data from? The graph part of the program works, so I didn't add that file, and setting the vertices works fine too... I'm just stuck on how to handle the edges, exactly, so does anyone have any advice? Right now, adding edges is commented out because that was how I handled it with the original data...

    Whoa, you're freakin' me out.
    All you gotta do is read in the data in its new format. You can either translate the new format to the old format, or you can write a method that creates objects from the new format explicitly. That's all there is to it.
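The advice above (read the new edge-only format directly, building the same vertex index and edge list) can be sketched like this. The original program is Java; this is just an illustrative Python sketch of the parsing logic, with made-up names, mirroring the hashtable-of-positions idea from the posted code.

```python
def parse_edge_list(lines):
    """Translate the new edge-only format ("a b" per line) into the
    vertex list, position index and edge pairs the original parser built."""
    vertices = []  # vertex names in insertion order
    index = {}     # vertex name -> array position (like the Hashtable)
    edges = []     # (start_index, end_index) pairs
    for line in lines:
        tokens = line.split()
        if len(tokens) != 2:
            continue  # skip blank or malformed lines
        for v in tokens:
            if v not in index:  # register each vertex the first time it appears
                index[v] = len(vertices)
                vertices.append(v)
        start, end = tokens
        edges.append((index[start], index[end]))
    return vertices, edges
```

Because every line is an edge, there is no need for a second pass over the file: vertices are discovered on the fly, and each edge can be added as soon as both endpoints have a position.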

  • Data block and storage procedure

    Hi,
    I have a problem with a form that is built on a stored procedure.
    When I compile the form I get this error:
    Compilation error on DELETE-PROCEDURE trigger on BLOCK22 data block:
    ERROR PL/SQL 306 at line 7, column 1
    wrong number or types of arguments in call to 'POPULATE_TABLE'
    ERROR PL/SQL 0 at line 7, column 1
    Statement ignored
    Table:
    CREATE TABLE LOKALIZACJE
    (     ID_LOK INTEGER,
         ADRES VARCHAR2(40),
         OPIS VARCHAR2(80));
    Package:
    create or replace package pkg_lok_1 is
    type lok_rec is record
    (id_lok lokalizacje.id_lok%type,
    adres lokalizacje.adres%type,
    opis lokalizacje.opis%type);
    TYPE return_cur IS REF CURSOR RETURN lokalizacje%ROWTYPE;
    TYPE return_tab IS TABLE OF lok_rec;
    PROCEDURE wyswietl (param_return_rec IN OUT return_cur);
    PROCEDURE usun (p_emp_table IN OUT return_tab);
    END pkg_lok_1;
    CREATE OR REPLACE PACKAGE BODY pkg_lok_1 AS
    PROCEDURE wyswietl (param_return_rec IN OUT return_cur) IS
    BEGIN
    OPEN param_return_rec FOR
    SELECT * FROM lokalizacje;
    END wyswietl;
    PROCEDURE usun (p_emp_table IN OUT return_tab) is
    BEGIN
    delete lokalizacje
    where id_lok=p_emp_table(0).id_lok;
    END usun;
    END pkg_lok_1;
    And the DELETE-PROCEDURE trigger:
    DECLARE
    bk_data PKG_LOK_1.RETURN_TAB;
    BEGIN
    PLSQL_TABLE.POPULATE_TABLE(bk_data, 'BLOCK22', PLSQL_TABLE.DELETE_RECORDS);
    PKG_LOK_1.USUN(bk_data);
    END;
    Procedure wyswietl works fine, but the form doesn't compile when I use procedure usun.
    Where is the problem? Could someone help me?
    james

    Perhaps your block has non-database items that are declared as database items.

  • Labview data collection and storage?

    For a complete LabVIEW beginner, what is the best way to collect data from hardware connected by an RS-422 serial port?

    At the moment I've set up the attached, but I can only get it to input one set of values. How can I get it to continuously output the files? Secondly, I would rather output them in a spreadsheet format, with the five different numbers (three from the I/O assistant and two from the Date/Time String) in different columns. I'm trying to use the Write Spreadsheet VI but it needs an orange connection?
    Cheers
    Attachments:
    first attempt.PNG (15 KB)

  • 4503/4506 - data handling and data flow

    I have been tasked to find a document that details how, from start to finish, a packet goes through a 4503/4506 unit. I'm talking a level of detail that includes what portion of RAM the packet goes into once it hits the inbound interface, what parts of the switch handle the analysis (ACLs, et all), and so on, right until the packet is either dropped or forwarded to the outgoing int. As detailed a description as possible, and if a non-model-specific equivalent is available and applicable to this unit, that works as well.
    I have been looking through the TechDocs and the like, as well as several attempts at Google (which is well-nigh useless), and no luck thus far.
    Thanks in advance for any information provided.

    I am not aware of any CCO documents explaining the path of a packet through the CAT4500 architecture. However, there was a presentation on this at Networkers 2005. If you attended it, you can check it out at
    http://www.cisco.com/networkers/nw05/nwol.html
    Here is the session information for RST-4500.
    Session Title: Cisco Catalyst 4500 Switch Architecture
    Length: 2 Hours
    Level: Intermediate
    Related Sessions:
    RST-3031 Troubleshooting LAN Protocols
    RST-3042 Troubleshooting Cisco Catalyst 4000 and 4500 Series Switches
    Abstract: This session presents an in-depth study of the architecture of the Cisco Catalyst 4500 Switches and how the various components work together. The focus for this session is present information to help the audience understand the architecture to be able to design and implement Catalyst 4500 in their network and troubleshoot better.
    Topics include a discussion of the architecture; information about the latest Cisco Catalyst 4500 supervisors and switching modules such as the Supervisor V-10GE, Netflow Feature Card (NFL), Catalyst 4948, Supervisor II+TS and PoE linecards; as well as the key features such as CEF/Multicast Forwarding, DHCP Snooping, IP Source Guard, Dynamic ARP Inspection, 802.1X, Redundancy (SSO), Netflow, ACL/TCAM, and QoS (Per-port/Per-VLAN and UBRL).
    This session is designed for network designers and senior network operations engineers who are considering deploying, or already have, Cisco Catalyst 4500 series switches in enterprise and service provider networks.
    * Prerequisites:
    1. Knowledge of LAN protocols is required
    2. Basic understanding of Cisco Catalyst switches is required.
    Speakers: Balaji Sivasubramanian
    Escalation Eng
    Cisco Systems
    Balaji Sivasubramanian is an escalation engineer in Cisco's Gigabit Switching Business Unit. Balaji, who is a CCNP, is also co-author of "CCNP Self-Study: Building Cisco Multilayered Switched Networks - 2nd Edition" (ISBN 1587051508). Balaji is an expert in Catalyst 4500 switch architecture and in troubleshooting LAN protocols and Catalyst switches, including the Catalyst 4500, Catalyst 6500 and Catalyst 3500. In his 5+ years with Cisco, Balaji has also held positions of TAC Technical Leader/Expert in LAN/Campus switching, Worldwide Subject Matter Expert in LAN technologies for the Cisco TAC, and TAC support engineer in LAN/Campus switching.

  • Bridge meta data handling and filtering

    It would be great if you could add two features:
    1. Adding keywords to multiple files: currently, if two files share the same keywords but not in the same order, adding new ones is impossible without overwriting the existing keywords.
    2. Filtering by location data: it would be great to filter by country, say, just as by keyword, date or other criteria.
    Thanks, Erik

    For adding keywords without removing existing keywords, this script might be of use to you...
    http://www.ps-scripts.com/bb/viewtopic.php?f=19&t=2364&sid=57a50afd1b53b5fc195f0e5dfdbfab06

  • Does the iPad USB adapter allow data transfer and storage?

    I hear that it "sneezes" at external hard drives so am wondering if the camera connection kit allows any write or only read?

    Presently the iPad can only import photos and videos that are placed in a "DCIM" folder in the root directory of a FAT 16 or 32 formatted device. It is not clear whether the device can be a hard disc. I have seen videos where a hard disc did not work. I had an old 4 GB flash drive that did not work since it drew too much power (I got an error message).
    Try doing a Google search.

  • SMS forward/group sending/single delete & DATA transfer and storage

    As I've learned in my last 8 days as a new 3G owner, basic functions to forward an SMS, to delete a single SMS, to send one SMS to a group (I'm lucky to have my old Siemens mobile phone to inform all my people that I have a new number and an iPhone...) or an MMS function are NOT INSTALLED, which I consider a weakness. Also needed are some apps that work with a kind of Finder to transfer some (business) files onto the phone. The 3G is supposed to be a business solution? I don't want to send myself x emails just to get attached documents.
    My question now:
    Do you know if there are legal programs in the Apple App Store which can help me with daily iPhone functions, until, someday, a firmware update is able to cover this?
    I would really appreciate your help in finding these apps.
    Regards
    <Edited by Moderator>

    "For the OP: you can send an SMS to a group, just hit the "+" button to add more numbers. As for the rest of your questions, the answer would be no, but you should definitely give Apple feedback on this; they might add it in one of the future updates."
    This is how to send a MASS text... not how to send a text to a GROUP.
    This is a major flaw with the phone: not being able to select a group, then text or email them. Not to mention zero support for multimedia in texts.
    I REALLY hope Apple fixes this, but it doesn't look like it will happen.

  • Sales order stock and storage location stock on past date with value

    Hi everyone,
    I'm looking for a report which can give sales order stock with value, as shown in MB5B, and storage location stock with value, as in MC.5. Is there any report which can give the combination of both things on a past date with values?

    Can you tell me what the fields and tables would be for special stock on a past date with value, and for storage location stock on a past date with value?

  • A message will pop up (Exc in ev handl: Error: Bad NPObject as private data!) and the tab I was on will close out and reopen in a new window. Why is it happening and how do I stop it?

    Okay, I will have a window open with 4 tabs. At first everything is fine, but after a day or so a message pops up on screen saying "'''Exc in ev handl: Error: Bad NPObject as private data!'''" and after you close it, the tab you were on will close and reopen in its own window. I then have to shut down all windows, both old and new, and open a new window with my original 4 tabs again. It will work fine until a few days pass and it starts over again.

    This issue can be caused by the McAfee Site Advisor extension.
    *https://support.mozilla.com/kb/Troubleshooting+extensions+and+themes
    Start Firefox in <u>[[Safe Mode]]</u> to check if one of the extensions or if hardware acceleration is causing the problem (switch to the DEFAULT theme: Firefox (Tools) > Add-ons > Appearance/Themes).
    *Don't make any changes on the Safe mode start window.
    *https://support.mozilla.com/kb/Safe+Mode

  • UIX with XSQL as XML data provider and event handler

    Hello,
    I would like to bind XML data to the messageInput elements of a form element,
    as values to be presented before entering (data provider)
    as well as input values to be persisted after completing the form (event handler).
    My impression (as a newbie) is that only for BC4J integration is there a bidirectional binding with view objects.
    Can I use 'include' to bind a static XML file as a data source for output?
    How can I use XSQL to be bound as data for input as well as for output of a form?
    A last question concerning a page with 3 tabs:
    do I need 3 different pages and requests to get the data of the 3 tabs,
    or is it possible to get the whole data of the page in one request
    and distribute it over the 3 tabs?
    Any help appreciated
    Thanks
    Klaus Dreistadt

    You could do this, but we don't provide any tools to make it easy.
    You'd have to write an implementation of the DataObject interface
    that gives your UI access to the XML document, and write custom
    event handlers to perform the "set" side of things. The Data Binding
    and UIX Controller chapters of the UIX developer's guide will give you
    a high-level view of how to accomplish this, but nothing specifically
    about reading or writing to XML documents.

  • Can I use an OLE DB Command Task to call a parameterized stored procedure, perform some data editing and pass variables back to the SSIS for handling?

    I am using a Data Flow and an OLE DB Source to read my staged 3rd-party external data. I need to do various Lookups to try and determine if I can find the external person in our database... by SSN... by name and DOB... etc.
    Now I need to do some more data verification based on whichever Lookup is successful. Can I do those data edits against our SQL Server application database by utilizing an OLE DB Command? Using a stored procedure, or can I use straight SQL to perform my edits
    against every staging row by using a parameter-driven query? I'm thinking a stored procedure is the way to go here since I have multiple edits against the database. Can I pass back the result of those edits via a variable and then continue my SSIS Data Flow
    by analyzing the result of my stored procedure? And how would I do that?
    I am new to the SSIS game here, so please be kind and as explicit as possible. If you know of any good websites that walk through how to perform SQL Server database edits against external data in SSIS, or even a YouTube video, please let me know.
    Thanks!

    Thanks for that... but can I do multiple edits in my stored procedure, Vaibhav, and pass back something that I can then utilize in my SSIS? For example...
    One and only one member span: I'd be doing a SELECT COUNT(*) based on my match criteria, or handling the count accordingly in my stored procedure, passing something back via the OLE DB Command and handling it appropriately in SSIS.
    Are there "Diabetes" claims... again, probably by analyzing a SELECT COUNT(*).
    Am I expecting too much from SSIS... should I be doing all of this in a stored procedure? I was hoping to use the SSIS GUI for everything, but maybe that's just not possible. Rather, use the stored procedure to analyze my staged data, edit accordingly, do
    data stores accordingly (especially the data anomalies), and then use SSIS to control navigation.
    Your thoughts...
    Could you maybe clarify the difference between an OLE DB Command in the Data Flow and the Execute SQL Task in the Control Flow...
    You can get return values from the OLE DB Command if you want to stay in the pipeline.
    See this link for more details:
    http://josef-richberg.squarespace.com/journal/2011/6/30/ssis-oledb-command-and-procedure-output-params.html
    The procedure should have an output parameter defined for that.
    I believe that if you have the flexibility of using a stored procedure, you may be better off doing this in an Execute SQL Task in the Control Flow. Calling the SP in the Data Flow will cause it to execute once for each row in the dataset, whereas in the Control Flow it will use set-based
    processing.
    Visakh
