A Very Big Challenge!

Hi all,
Assume I have a few tables like:
USER: user_id, user_name, ac_id
PROVINCE: pro_id, pro_name
CITY: cit_id, cit_name, pro_id
BANK: bk_id, bk_name, cit_id
ACCOUNT: ac_id, ac_number, bk_id
I want to create a page that users can use to recharge money into their own user_id. I want the page to have the items below:
1. user_id [] --------------- this should be read and displayed automatically after the user has successfully logged in (according to the USER table)
2. user_name [] ---------- this should be read and displayed automatically too, according to the USER table
3. bank_province [] --- for this item, the user can select which province the bank is located in (the page reads the values from the PROVINCE table)
4. bank_city [] -------------- for this item, the user can select which city the bank is located in (it has a select list too, populated from the CITY table)
5. bank_name [] ----------- for this item, the bank the account belongs to should be displayed automatically
6. bank_account [] ------- this can be displayed automatically or entered manually by the user
7. bank_account_password -- this should be typed by the user, and this item is certainly associated with the bank_account item
8. money_recharge [] ----- this item is the amount the user wants to recharge; it should be filled in by the user
9. user_account_balance [] -------- this item shows how much money (the previous balance + money_recharge) the user finally has in his user_id
Finally, there should be two buttons at the bottom of the page to submit the above data to the database.
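Item 9 is just an addition, but money fields are worth doing with BigDecimal rather than double. A minimal sketch (the class and method names here are only illustrative, not from any framework):

```java
import java.math.BigDecimal;

public class RechargeMath {
    // Item 9: user_account_balance = previous balance + money_recharge.
    // BigDecimal avoids the rounding surprises of double arithmetic on money.
    public static BigDecimal newBalance(BigDecimal previous, BigDecimal recharge) {
        if (recharge.signum() <= 0) {
            throw new IllegalArgumentException("recharge amount must be positive");
        }
        return previous.add(recharge);
    }
}
```

The page would compute and display this value in user_account_balance before the user submits.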
best regards
hlee
Thanks for your patience in reading and answering my questions!
Edited by: hlee on 2011-3-17 at 11:25 PM

Which version of Apex are you using?
Generally reading through your questions, I suggest you take a look at the various demos and some of the basic apex documentation.
[You can start here|http://www.oracle.com/technetwork/developer-tools/apex/learnmore/index.html]
I'll get you started with 1 and 2.
+1. user_id [] --------------- this should be read and displayed automatically after the user has successfully logged in (according to the USER table)+
+2. user_name [] ---------- this should be read and displayed automatically too, according to the USER table+
For 1 and 2, you can create a process which will take place after login on page 101 to use that user name to go to your user table to fetch the user id. You can put the user id in an application item.
Cheers
Kofi
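In APEX the lookup Kofi describes would normally be a PL/SQL after-login process, but the lookup itself is one parameterized query. A sketch in plain JDBC, with table and column names taken from the question (where the Connection comes from is left out):

```java
import java.sql.*;

public class UserIdLookup {
    // The post-login lookup: map the authenticated user name to its user_id.
    // USER is quoted because it is a reserved word in Oracle.
    static final String SQL = "SELECT user_id FROM \"USER\" WHERE user_name = ?";

    // Returns null when no matching row exists.
    public static Long fetchUserId(Connection conn, String userName) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(SQL)) {
            ps.setString(1, userName);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getLong(1) : null;
            }
        }
    }
}
```

In APEX you would store the fetched value in an application item so every page can reference it.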

Similar Messages

  • What is the easiest way to create and manage very big forms?

    I need to create a form that will contain a few hundred questions. Could you please give me some advice on the easiest way to do that? For example, is it easier to create everything in Word (since it is easier to manage) and then create a form based on that?
    My concern is that when I have a very big form, containing different kinds of questions and many scripts, managing it during work will be slow and difficult: for example, adding a question in the middle of the form would require moving half of the questions down, which could smash the layout, etc.
    What is the best practice for that?
    Thanks in advance

    Try using tables and rows for this kind of form. These forms will have the same look throughout, with a question and an answer section.
    In the future, if you want to add a new section, you can simply add rows in between.
    Thanks
    Srini

  • Improve the performance in stored procedure using sql server 2008 - esp where clause in very big table - Urgent

    Hi,
    I am looking for input on tuning a stored procedure in SQL Server 2008. I am new to performance tuning in SQL, PL/SQL, and Oracle. I am currently facing an issue in a stored procedure: I need to increase performance through code optimization and filtering the records with a where clause on a larger table. The stored procedure generates an audit report which is accessed by approx. 10 admin users, typically 2-3 times a day each.
    It has a CTE (common table expression) which is referenced twice within the SP. This CTE is very big and fetches records from several tables without a where clause. This causes many records to be fetched from the DB and then processed. The stored procedure runs on a pre-prod server which has 6 GB of memory on a virtual server; the same proc ran well (40 sec) on the prod server, which has 64 GB of RAM on a physical server. The execution time in pre-prod is 1 min 9 sec, which needs to be reduced to around 10 sec. The execution time also varies: sometimes it is 50 sec and sometimes 1 min 9 sec.
    Please advise on the best option/practice for using a where clause to filter the records, and on which tool to use to tune the procedure (execution plan, SQL Profiler?). I am using Toad for SQL Server 5.7. I see an execution plan tab available while running the SP, but when I run it, it throws an error. Please help and provide input.
    Thanks,
    Viji

    You've asked a SQL Server question in an Oracle forum.  I'm expecting that this will get locked momentarily when a moderator drops by.
    Microsoft has its own forums for SQL Server; you'll have more luck over there.  When you do go there, however, you'll almost certainly get more help if you can pare down the problem (or at least better explain what your code is doing).  Very few people want to read hundreds of lines of code, guess what it's supposed to do, guess what is slow, and then guess at how to improve things.  Posting query plans, the results of profiling, cutting out any code that is unnecessary to the performance problem, etc. will get you much better answers.
    Justin

  • Database Log File becomes very big, What's the best practice to handle it?

    The log of my production database is getting very big and the hard disk is almost full. I am pretty new to SAP, but familiar with SQL Server. Can anybody give me advice on the best practice to handle this issue?
    Should I shrink the database?
    I know increasing hard disk space is needed for the long term.
    Thanks in advance.

    Hi Finke,
    Usually the log file fills up and grows huge due to not having regular transaction log backups. If your database is in FULL recovery mode, every transaction is logged in the transaction log file, and it gets cleared when you take a log backup. If it is a production system and you don't have regular transaction log backups, the problem is just sitting there waiting to explode when you need a point-in-time restore. Please check your backup/restore strategy.
    Follow these steps to get transactional file back in normal shape:
    1.) Take a transaction log backup.
    2.) Shrink the log file: DBCC SHRINKFILE('logfilename', 10240)
          The above command will shrink the file to 10 GB (a recommended size for highly transactional systems).
    >
    Finke Xie wrote:
    > Should I Shrink the Database?
    "NEVER SHRINK DATA FILES": shrink only the log file.
    3.) Schedule log backups every 15 minutes.
    Thanks
    Mush

  • Optimize delete in a very big database table

    Hi,
    To delete entries from a database table I use the statement:
    DELETE FROM <table> WHERE <zone> = 'X'.
    The delete takes seven hours (the table is very big and <zone> isn't an index).
    How can I optimize it to reduce the delete time?
    Thanks in advance for your response.
    Regards.

    What is the size of the table and how many lines are you going to delete?
    I would recommend deleting only 5,000 to 10,000 records in one step, looping until nothing is left, e.g.:
    DO.
      SELECT * FROM <table>
        INTO TABLE itab
        UP TO 10000 ROWS
        WHERE <zone> = 'X'.
      IF itab IS INITIAL.
        EXIT.
      ENDIF.
      DELETE <table> FROM TABLE itab.
      COMMIT WORK.
    ENDDO.
    If this is still too slow, then you should create a secondary index on <zone>.
    You can drop the index after the deletion is finished.
    Siegfried

  • Is this beta? The image for Yosemite is a very big rock, appropriate as this OS sends your mac back to the stone-age. iTunes crashes on open, iPhoto won't download, movies won't play, Safari won't show graphics, wifi down to speed of crippled snail.

    Is this beta? The image for Yosemite is a very big rock, appropriate as this OS sends your Mac back to the Stone Age. iTunes crashes on open, iPhoto won't update, Safari shows text without graphics, I can't get Java 8 because Safari "can't find server", and Wi-Fi is slower than a crippled snail. Does anyone know how to uninstall this rubbish?

    travellingbirder wrote:
    > Is this beta?
    You tell us: see Finding the OS X version and build information on your Mac.

  • How do i open a VERY big file?

    I hope someone can help.
    I did some testing using a LeCroy LT342 in segment mode. Using the LabVIEW driver I downloaded the data over GPIB and saved it to a spreadsheet file. Unfortunately it created very big files (ranging from 200 MB to 600 MB). I now need to process them, but LabVIEW doesn't like them. I would be very happy to split the files into an individual file for each row (I can do this quite easily), but LabVIEW just sits there when I try to open the file.
    I don't know enough about computers and memory (my spec is a 1.8 GHz Pentium 4 with 384 MB RAM) to figure out whether it will do the job if I just leave it long enough.
    Has anyone any experience or help they could offer?
    Thanks,
    Phil

    When you open (and read) a file, you usually move it from your hard disk (permanent storage) to RAM.  This allows you to manipulate it at high speed using fast RAM; if you don't have enough RAM to read the whole file, you will be forced to use virtual memory (swap space on the HD used as "virtual" RAM), which is very slow.  Since you only have 384 MB of RAM and want to process huge files (200 MB-600 MB), you could easily and inexpensively upgrade to 1 GB of RAM and see large speed increases.  A better option is to load the file in chunks, looking at some number of lines at a time, processing that amount of data, and repeating until the file is complete. This is more programming but will allow you to use much less RAM at any instant.
    Paul
    Paul Falkenstein
    Coleman Technologies Inc.
    CLA, CPI, AIA-Vision
    Labview 4.0- 2013, RT, Vision, FPGA
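    The chunked approach Paul describes (the question is about LabVIEW, but the idea is language-neutral) could be sketched like this in Java; the chunk size and the handler stand in for whatever per-chunk processing you need:

```java
import java.io.*;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class ChunkedReader {
    // Read a large text file a fixed number of lines at a time, handing each
    // chunk to a callback so the whole file never has to fit in RAM.
    // Returns the total number of lines processed.
    public static long processInChunks(File file, int linesPerChunk,
                                       Consumer<List<String>> handler) throws IOException {
        long total = 0;
        try (BufferedReader in = new BufferedReader(new FileReader(file))) {
            List<String> chunk = new ArrayList<>(linesPerChunk);
            String line;
            while ((line = in.readLine()) != null) {
                chunk.add(line);
                total++;
                if (chunk.size() == linesPerChunk) {
                    handler.accept(chunk);
                    chunk = new ArrayList<>(linesPerChunk);
                }
            }
            if (!chunk.isEmpty()) handler.accept(chunk); // final partial chunk
        }
        return total;
    }
}
```

    Peak memory is then bounded by the chunk size, not the file size, at the cost of one pass over the file.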

  • Why does my Hotmail site appear very BIG in Firefox?

    Suddenly the whole site was VERY big: the letters, symbols, everything. I have not done anything myself to change the site.
    What can I do to change it back to "normal"?

    See this:
    https://support.mozilla.com/en-US/kb/Page+Zoom

  • Print very big JTable

    Hi all,
    I have to print a very big table with 100,000 rows and 6 columns. I have put System.gc() at the end of the print method, but when I print the table the print job becomes too big (more or less 700 kB per page, and there are 1048 pages).
    Would it be possible to make a PDF of my table, and would that solution be better than the first?
    When I make the preview it takes a lot of time because of the size of the table, since first I have to create the table and then preview it.
    Is there a way to reduce the time lost in table generation?
    N.B.: the data in the table is always the same.
    Thanks a lot!!!

    > Is there a way to reduce the time lost in table generation?
    Write a table model extending AbstractTableModel. The model is queried for each cell; usually all the columns of one row are retrieved before getting the next row. You may cache one row in the model: not the whole table!
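    A sketch of that single-row cache; the fetchRow method here is a stand-in for whatever slow data source backs the 100,000-row table:

```java
import javax.swing.table.AbstractTableModel;

public class BigTableModel extends AbstractTableModel {
    private final int rows, cols;
    private int cachedRow = -1;   // index of the one row kept in memory
    private Object[] cache;

    public BigTableModel(int rows, int cols) { this.rows = rows; this.cols = cols; }

    @Override public int getRowCount()    { return rows; }
    @Override public int getColumnCount() { return cols; }

    @Override public Object getValueAt(int row, int col) {
        if (row != cachedRow) {   // cache miss: fetch just this row
            cache = fetchRow(row);
            cachedRow = row;
        }
        return cache[col];
    }

    // Stand-in for the real data source (database query, file seek, ...).
    private Object[] fetchRow(int row) {
        Object[] r = new Object[cols];
        for (int c = 0; c < cols; c++) r[c] = "r" + row + "c" + c;
        return r;
    }
}
```

    Since JTable asks for cells row by row, every column after the first is served from the cached row.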

  • Very Big Cell when export in Excel

    Dear Tech guys,
    I use VS 2008 with VB and CR 2008.
    The Crystal report and the export to PDF are OK, but when I export the report to Excel, I have the problems below.
    The report is a delivery note with 7 columns and many rows.
    1. On all pages, the page numbers are lost, except on the last page.
    2. After the last row, Excel has a very tall row (height > 300). Because of this, Excel creates a new empty page, and at the bottom of the new page I see the page number (Page 5 of 5).
    Can you help me with this problem?
    I have had this problem since the last update (Service Pack 3).
    Visual Studio 2008: 9.0.30729.1 SP 1
    Crystal Reports 2008: 12.3.0.601
    Thank you in advance.
    Best regards,
    Navarino Technology Dept.
    Edited by: Navarino on Jul 15, 2010 2:47 PM

    Dear all, good morning from Greece.
    First of all, I would like to thank you for your quick response.
    Dear Ludek Uher,
    1. Yes, this is from a .NET (3.5) application with VB.
    2. I do the export via code.
    3. From the CR designer I have the same problem.
    Dear David Hilton,
    The photo is not working.
    I found the option "On Each Page" in the CR designer and changed it. Now I get the page number on every page, but I can see that something is wrong with the page height and with the end of the report page.
    I will try to show you the problem in the Excel file after setting the option "On Each Page":
    Header........................
                      Field1     field2    field3 ......
    row1 .......
    row2 ......
    row3.....
    row56 ......
    {end of page 1}
    {new page 2}
    row57
    row58
    row59
    Page 1 of 4 (the footer of the first page should be on the first page, but it is shown on the second page)
    row60
    row61
    row62
    {end of page 2}
    {new page 3}
    row110
    row111
    row112
    Page 2 of 4 (the footer of the second page should be on the second page, but it is shown on the third page)
    row140
    row141
    row142
    {end of page 3}
    {new page 4}
    and go on.....
    I hope this helped.
    If I change the margins in Page Break Preview in Excel, the pages are OK. So I think something conflicts with the end of the page: the report does not detect where the page ends.
    If there is a way to send you the file or screenshots, please tell me how.
    Thank you in advance again.
    Best regards,
    Navarino Technology Dept.
    Edited by: Navarino on Jul 16, 2010 9:09 AM

  • VERY big files (!!) created by QuarkXPress 7

    Hi there!
    I have a "problem" with QuarkXPress 7.3 and I don't know if this is the right forum to ask...
    Anyway, I have created a document, about 750 pages, with 1000 pictures placed in it. I have divided it into 3 layouts.
    When I save the file, the file created is 1.20 GB!
    Isn't that a very big file for QuarkXPress?
    The project has 3 layouts. I tried to make a copy of the file and delete 2 of the 3 layouts, and the project's file size is still the same!
    (Last year, I created (almost) the same document, and as I checked that document now, its size is about 280 MB!)
    The problem is that I have "autosave" on (every 5 or 10 minutes) and it takes some time to save!
    Can anyone help me with that?
    Why has QuarkXPress made such a big file?
    Thank you all for your time!

    This is really a Quark issue and better asked in their forum areas. However, have you tried to do a Save As and see how big the resultant document is?

  • Very big problem with JSF about FORM and "id=" for HTML form's elements and

    I have discovered a very big problem with JSF involving FORMs, the "id=" attribute of HTML form elements, and the Java instruction request.getParameterNames().
    Suppose you have something like this, to render some datas form a Java Beans :
    <h:dataTable value="#{TablesDb2Bean.myDataDb2ListSelection}" var="current" border="2" width="50%" cellpadding="2" cellspacing="2" style="text-align: center">
      <h:column>
        <f:facet name="header">
          <h:outputText value="Name"/>
        </f:facet>
        <h:outputText id="nameTableDb2" value="#{current.db2_name_table}"/>
      </h:column>
    </h:dataTable>
    Everything works fine...
    Suppose you want to get the name/value pairs for id="nameTableDb2" and #{current.db2_name_table} to process them in a servlet. Here is the HTML generated :
    <td><span id="j_id_jsp_1715189495_22:0:nameTableDb2">my-table-db2-xxxxx</span></td>
    You might think you can use the Java instructions:
    Enumeration namesParam = request.getParameterNames();
    while (namesParam.hasMoreElements()) {
        String nameParam = (String) namesParam.nextElement();
        out.println("<h4>" + nameParam + "</h4>");
    }
    YOU ARE WRONG: request.getParameterNames() wants the syntax name="nameTableDb2", but JSF must use id="nameTableDb2" for <h:outputText>... So you can't process data from a FORM generated with JSF in a servlet! Perhaps I have made an error, but really, I wonder where?
    Edited by: ungars on Jul 18, 2010 12:43 AM
    Edited by: ungars on Jul 18, 2010 12:45 AM

    While I certainly appreciate ejb's helpful responses, this thread shows up a difference in perspective between how I read the forum and how others do. Author ejb is correct in advising you to stay inside JSF for form processing if form processing is what you want to do.
    However, I detect another aspect to this post which reminds me of something Marc Andreesen once said when he was trying to get Netscape off the ground: "there's no such thing as bad HTML."
    In this case, I interpret ungar's request as a new feature request. Can I phrase it like this?
    "Wouldn't it be nice if I could render my nice form with JSF but, in certain cases, when I REALLY know what I'm doing, just post out to a separate servlet? I know that in this case I'll be missing out on all the nice validation, conversion, l10n, i18n, Ajax, portlet and other features provided by JSF."
    If this is the case: because it really misses the point of JSF, we don't allow it, but we do have an issue filed for it:
    https://javaserverfaces-spec-public.dev.java.net/issues/show_bug.cgi?id=127
    If you can't wait for it to be fixed, you could decorate the FormRenderer to fix what you want.
    I have an example in my JSF book that shows how to do this decoration. http://bit.ly/edburnsjsf2
    Ed

  • Very Big Table (36 Indexes, 1000000 Records)

    Hi
    I have a very big table (76 columns, 1,000,000 records). These 76 columns include 36 foreign key columns; each FK has an index on the table, and only one of these FK columns has a value at any one time while all the others are NULL. All these FK columns are of type NUMBER(20,0).
    I am facing a performance problem which I want to resolve, taking into consideration that this table is used with DML (insert, update, delete) as well as query (select) operations, all performed daily. I want to improve this table's performance, and I am considering these scenarios:
    1- Replace all these 36 FK columns with 2 columns (ID, TABLE_NAME) (ID for master table ID value, and TABLE_NAME for master table name) and create only one index on these 2 columns.
    2- partition the table using its YEAR column, keep all FK columns but drop all indexes on these columns.
    3- partition the table using its YEAR column, and drop all FK columns, create (ID,TABLE_NAME) columns, and create index on (TABLE_NAME,YEAR) columns.
    Which way is more efficient?
    Do I have to take "master-detail" relations in mind when building Forms on this table?
    Are there any other suggestions?
    I am using Oracle 8.1.7 database.
    Please Help.

    Hi everybody
    I would like to thank you for your cooperation, and I will try to answer your questions. Please note that I am a developer first and I am new to Oracle database administration, so please forgive any mistakes.
    Q: Have you gathered statistics on the tables in your database?
    A: No I did not. And if I must do it, must I do it for all database tables or only for this big table?
    Q:Actually tracing the session with 10046 level 8 will give some clear idea on where your query is waiting.
    A: Actually I do not know what you mean by "10046 level 8".
    Q: what OS and what kind of server (hardware) are you using
    A: I am using Windows2000 Server operating system, my server has 2 Intel XEON 500MHz + 2.5GB RAM + 4 * 36GB Hard Disks(on RAID 5 controller).
    Q: How many concurrent users do you have and how many transactions per hour?
    A: I have 40 concurrent users and an average of 100 transactions per hour, but the peak can go up to 1000 transactions per hour.
    Q: How fast should your queries be executed?
    A: I want the queries executed in about 10 to 15 seconds, or else everybody here will complain. Please note that because this table is highly used, there is a very good chance that 2 or more transactions exist at the same time, one performing a query and the other performing a DML operation. Some of these queries are used in reports, and they can be long queries (e.g. retrieving a summary of 50,000 records).
    Q: Please show us the explain plan of these queries.
    A: If I understand your question, you ask me to show you the explain plan of those queries. Well, first, I do not know how, and second, I think it is a big question because I cannot collect all the kinds of queries that have been written against this table (some of them exist in server packages, and the others are performed by Forms or Reports).

  • Question about reading a very big file into a buffer.

    Hi, everyone!
    I want to randomly load several characters from the GB2312 charset to form a string.
    I have two questions:
    1. Where can I find the charset table file? I have googled for hours but failed to find a GB2312 charset file.
    2. I think the charset table file is very big, and I doubt whether I can load it into a String or StringBuffer. Does anyone have a solution? How do I load a very big file and randomly select several characters from it?
    Have I made myself understood?
    Thanks in advance,
    George

    The following can give the correspondence between GB2312-encoded byte arrays and characters (in hexadecimal integer expression):
    import java.nio.charset.*;
    import java.io.*;
    public class GBs {
        static String convert() throws UnsupportedEncodingException {
            StringBuffer buffer = new StringBuffer();
            String l_separator = System.getProperty("line.separator");
            Charset chset = Charset.forName("EUC_CN"); // GB2312 is an alias of this encoding
            CharsetEncoder encoder = chset.newEncoder();
            int[] indices = new int[Character.MAX_VALUE + 1];
            for (int j = 0; j < indices.length; j++) {
                indices[j] = 0;
            }
            for (int j = 0; j <= Character.MAX_VALUE; j++) {
                if (encoder.canEncode((char) j)) indices[j] = 1;
            }
            byte[] encoded;
            for (int j = 0; j < indices.length; j++) {
                if (indices[j] == 1) {
                    encoded = Character.toString((char) j).getBytes("EUC_CN");
                    for (int q = 0; q < encoded.length; q++) {
                        buffer.append(Byte.toString(encoded[q]));
                        buffer.append(" ");
                    }
                    buffer.append(": 0x").append(Integer.toHexString(j));
                    buffer.append(l_separator);
                }
            }
            return buffer.toString();
        }
        // the following is for testing
        /*
        public static void main(String[] args) throws Exception {
            String str = GBs.convert();
            System.out.println(str);
        }
        */
    }

  • We have a very big issue here, Patch Tuesday didn't work correctly yesterday, and the patches took all the free drive space away.

    Hello
    We have a very big issue here: Patch Tuesday didn't work correctly yesterday, and the patches took all the free drive space away.
    Before we started with the patches, we had one gig of free space on our hard drive; now we are down to less than 10 megs!
    What happened is that the patches didn't work. We had to try them 7 times, and every time they created a 'temporary restore point', and we don't know where that point is.
    The server has crashed twice since; we were able to bring it back online, but we're afraid it will crash again.
    We have several customers on that server, and we are very worried.
    Can you please help us?
    Thank you

    Hi,
    Hope the article below is helpful:
    Clear space on your hard drive by deleting old Windows uninstall folders
    http://news.cnet.com/8301-13880_3-9862223-68.html
    Please Note: Since the web site is not hosted by Microsoft, the link may change without notice. Microsoft does not guarantee the accuracy of this information.
    And as far as I know, we cannot use diskpart to extend the C drive; 3rd-party utilities may work. Please go through the thread below:
    http://social.technet.microsoft.com/Forums/windowsserver/en-US/03068ecb-7edd-43d8-bc68-12bc466c583f/how-to-extend-the-windows-2003-c-drive?forum=winservergen
    Regards,
    Yan Li
