Important!!!! question about database with huge volume

Our database may end up managing, in at least one table, data volumes on the order of 37 thousand million (37,000,000,000) records. We would like to know whether anyone has experience with 10g at these volumes, and what recommendations those with that experience can give us.

37 thousand million records. Why not just say 37 billion records?

Because depending on where you are in the world, 37 thousand million isn't 37 billion. Whilst 1 billion in the U.S. is 1,000,000,000 (one thousand million), here in Britain 1 billion is 1,000,000,000,000 (one million million).

Thanks, that's good to know.
In 1974, British prime minister Harold Wilson explained to the House of Commons that UK government statistics would from then on use the short scale [6][7]. During the last quarter of the 20th century, most other English-speaking countries (Ireland, Australia, New Zealand, and South Africa) followed this lead and switched to short scale use. However, in all these countries some limited long scale use persists, and the official status of the short scale is not entirely clear.
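For a table in this size range, the usual 10g recommendations centre on partitioning, local indexes and parallel operations, so that loads, purges and index maintenance touch one partition at a time rather than all 37,000,000,000 rows. A minimal sketch of what that could look like, assuming a hypothetical MEASUREMENTS table partitioned by month (all names here are illustrative, not taken from the original post):

CREATE TABLE measurements (
  meas_id    NUMBER  NOT NULL,
  meas_date  DATE    NOT NULL,
  device_id  NUMBER  NOT NULL,
  meas_value NUMBER
)
PARTITION BY RANGE (meas_date) (
  PARTITION p2007_01 VALUES LESS THAN (TO_DATE('2007-02-01','YYYY-MM-DD')),
  PARTITION p2007_02 VALUES LESS THAN (TO_DATE('2007-03-01','YYYY-MM-DD')),
  PARTITION pmax     VALUES LESS THAN (MAXVALUE)
);

-- Local indexes keep each index partition small and independently rebuildable.
CREATE INDEX measurements_device_ix ON measurements (device_id) LOCAL;

Queries that filter on the partition key are then pruned to a handful of partitions instead of scanning the whole table.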

Similar Messages

  • Question about Communication with Oracle 7

    Hi,
    Generally we use Oracle Snapshots for communication, but since Oracle 9.2.0.5 does not work with Oracle 7.1.5.2.4, we are thinking about using Oracle Interconnect for this link.
    Now I have the following two questions about communication with Oracle 7:
    1. Is an adapter for Oracle 7.1.5.2.4 available?
    2. From your experience, does it make sense to install Oracle Interconnect for the implementation of only one read-only link with 3 tables if an OAS for BI is available?
    Thanks,
    Hannes

    1. The Oracle database adapter is for Oracle 8i and up. I checked the code needed in the database and doubt that it will work in Oracle 7. The other option, Advanced Queuing, also needs 8i and up.
    2. To me it doesn't make sense that you want to connect to Oracle 7. Oracle 7 is unsupported and should not have to be developed against. Also, the license cost for Oracle Interconnect, ~17K without any discount, is a bit steep for read-only links to three tables. Having the OAS means you won't have to pay another ~17K for the app server.
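    For readers unfamiliar with the snapshot approach mentioned in the question, a rough sketch of the classic pattern (a database link plus a read-only materialized view); the link name, table name, credentials and refresh interval below are placeholders, not taken from the original post:

    CREATE DATABASE LINK ora7_link
      CONNECT TO remote_user IDENTIFIED BY remote_password
      USING 'ORA7_TNS_ALIAS';

    -- Read-only local copy of one remote table, fully refreshed once a day.
    CREATE MATERIALIZED VIEW customers_mv
      REFRESH COMPLETE START WITH SYSDATE NEXT SYSDATE + 1
    AS
      SELECT * FROM customers@ora7_link;

    Whether the link itself works against 7.1.5.2.4 from a 9.2.0.5 database is exactly the compatibility problem described above, so treat this only as an illustration of the pattern.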

  • 3 important questions about SAP HANA

    Hello Community!
    I have got 3 important questions about SAP HANA.
    1.
    - Our company has about 150 SAP systems in total.
    - 12 of these are ERP systems.
    => We are considering SAP HANA at the moment.
    => Is it possible to put all 12 ERP systems on ONE SAP HANA, or do I need one separate HANA for each ERP system?
    => If it is possible, can you tell me the disadvantages of using one HANA for the 12 ERP systems?
    => Is there a storage limit in HANA (I think not, because of 64 bit)?
    => Can I still do real-time analytics with one HANA for 12 SAP systems?
    2.
    => Can I run ONE HANA with different non-SAP systems?
    3.
    => Is there a demo center from SAP or IBM where I can test and see HANA before I buy it?
         (I do not mean the Fujitsu TDS Demo Center)
    It would be nice if you could answer all my questions.
    Thank you.

    I have got 3 important questions about SAP HANA.
    1.
    - Our company has about 150 SAP systems in total.
    - 12 of these are ERP systems.
    => We are considering SAP HANA at the moment.
    => Is it possible to put all 12 ERP systems on ONE SAP HANA, or do I need one separate HANA for each ERP system?
    Yes, it is quite possible to put all the ERP/CRM/SCM or other components on one HANA. You need to check pricing, users and RAM (storage).
    => If it is possible, can you tell me the disadvantages of using one HANA for the 12 ERP systems?
    There are no disadvantages as of now; it can improve existing planning, forecasting, pricing optimization and other processes.
    => Is there a storage limit in HANA (I think not, because of 64 bit)?
    There is no limit on storage. You need to check how much data you need in RAM for real-time processing; the minimum you can buy is 64 GB of RAM, and multiples of that.
    => Can I still do real-time analytics with one HANA for 12 SAP systems?
    Yes, certainly.
    2.
    => Can I run ONE HANA with different non-SAP systems?
    Yes, you can run HANA with third-party software or non-SAP systems for your real-time analytics. It has both ODBC and JDBC interfaces, so you can combine it with other systems.
    3.
    => Is there a demo center from SAP or IBM where I can test and see HANA before I buy it?
    (I do not mean the Fujitsu TDS Demo Center)
    This you need to check with your local SAP service channel.
    It would be nice if you could answer all my questions.
    Thank you.
    PY

  • Question about function with in parameters

    Hello,
    I have a question about functions with IN parameters. In the HR schema, I need to get the minimum salary of the job_id that is passed as an IN parameter.
    This is what I am thinking, but I don't know if it's correct or what I should do next:
    create or replace function get_minimum_salary (i_job_id in varchar2)
    return number
    as
    begin
    SELECT min_salary INTO min_sal
    FROM jobs
    where job_id = get_minimum_salary(xy);
    RETURN i_job_id;
    end get_minimum_salary;
    thanks in advance
    EDIT
    Thanks for your help all.
    Is it possible to add the following: if the i_job_id passed as the IN parameter does not have a minimum salary, then use the procedure below to register an error?
    create or replace procedure insert_error (i_error_code in number,
                                                      i_error_message in varchar2)
    as
    begin
    insert into error_table (error_user, error_date, error_code, error_message)
    values (user,sysdate,i_error_code,i_error_message);
    end insert_error;
    This procedure is basically there to say that an error has occurred and to register that error; at the same time I need to print out the error using dbms_output.put_line.
    Any ideas of how to do that?
    Thanks again
    Edited by: Latvian83 on Jun 1, 2011 5:14 AM

    Hi,
    I have made a few small changes to your code. Try this:
    create or replace function get_minimum_salary (i_job_id in varchar2)
    return number
    as
      v_min_sal jobs.min_salary%type := 0; -- variable declaration
    begin
      SELECT min_salary INTO v_min_sal
      FROM jobs
      WHERE job_id = i_job_id;
      RETURN v_min_sal;
    end get_minimum_salary;
    Regards
    Srikkanth.M
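    On the follow-up question about registering an error when the given job_id has no minimum salary: a minimal sketch of how the NO_DATA_FOUND case could be handled, reusing the insert_error procedure from the question (the error code and message text are made up for illustration):

    create or replace function get_minimum_salary (i_job_id in varchar2)
      return number
    as
      v_min_sal jobs.min_salary%type;
    begin
      select min_salary
        into v_min_sal
        from jobs
       where job_id = i_job_id;
      return v_min_sal;
    exception
      when no_data_found then
        -- register the error and also print it, as requested
        insert_error(-20001, 'No minimum salary found for job_id ' || i_job_id);
        dbms_output.put_line('No minimum salary found for job_id ' || i_job_id);
        return null;
    end get_minimum_salary;
    /

    (Run SET SERVEROUTPUT ON in SQL*Plus to see the dbms_output line.)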

  • Few basic questions about database administration

    Hello,
    I have a few basic questions about database administration.
    1. I switched one of my oracle instances to archivelog mode. I just cannot locate the archive log files on my windows system. The %ora_home%/ora92/database/archive directory is desperately empty...
    2.What is the tools01.dbf datafile used for?
    3.What is the undotbso1.dbf datafile used for?
    Thanks in advance,
    Julien.

    1. The archive log location needs to be specified in your init.ora file. By default, Oracle will place the archive files in either ORACLE_HOME/dbs or ORACLE_HOME/database.
    2. The tools01.dbf file belongs to the TOOLS tablespace, which should be set as the default tablespace for SYSTEM. Its primary purpose is to hold Oracle Forms and Reports database objects; however, it can also be used to hold other non-SYS database objects such as the PERFSTAT (Statspack) schema or other third-party schemas, e.g. Quest's SQLab.
    3. The undotbs01.dbf file belongs to the undo tablespace.
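    As a hedged illustration of where that setting lives (the directory path is only an example, and on Standard Edition the older LOG_ARCHIVE_DEST parameter is used instead of LOG_ARCHIVE_DEST_n):

    -- init.ora entry, or via ALTER SYSTEM ... SCOPE=SPFILE when using an spfile:
    -- log_archive_dest_1 = 'LOCATION=D:\oracle\ora92\database\archive'

    -- Then, from SQL*Plus, check where archiving is actually going
    -- and force a log switch so at least one archived log appears:
    ARCHIVE LOG LIST
    ALTER SYSTEM SWITCH LOGFILE;
    SELECT name FROM v$archived_log WHERE ROWNUM <= 5;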

  • Questions about database application tables connector

    Hi all,
    I need to manage with OIM several databases (each one with its custom tables).
    Also, one of them will be my trusted target on the initial load of users. The structure of our tables is not similar to the included in the example.
    The documentation explains how OraAppX.xml can be modified, but it doesn't cover its relationship with "DBTable_nonTrusted.xml" and/or "DBTable_trusted.xml". For example, if I don't have a column called USR_COMM_LANG, could I replace this column name with another one that I have in my database? And what about the references in the "DBTable_nonTrusted.xml" file? The connector documentation doesn't explain how references to that field (USR_COMM_LANG---xel_usr_comm_lang) are managed by the connector, or whether the normal working of the connector could be affected if the name of a variable is changed in the XML configuration file.
    It could be that the adapters of this connector are hardcoded with the reference to xel_usr_comm_lang (???).
    The connector documentation includes a section titled "Adding Custom Database Columns for Provisioning and Reconciliation". The first impression from reading it is that it is possible to extend the predefined table, rather than create and define a new table structure.
    Has anyone rebuilt the "DBTable_nonTrusted.xml", "DBTable_trusted.xml" and "OraAppX.xml" files, changing all the default variables, in order to manage a customized table?
    Many thanks in advance,
    Claudia

    Hi Claudia,
    1) Yes, you must replace the column names in the "column" tag of your XML file with the column names that you have in your database tables. Also, remember that you need to configure the other parameters (like data_type, data_typ_size, etc.) according to your database table.
    2) The "xel_data_source" parameter is the attribute name that will be recognized by OIM, and these attributes can be mapped to form fields (UD_DBAPP_XXXX) in your process definition in the OIM Design Console.
    3) You can customize your Database Application Tables connector for provisioning as you wish, but some issues exist. In the Known Issues section of the documentation you can see that this connector doesn't work with more than two tables.
    4) To customize your Database Application Tables connector for trusted reconciliation, there are other restrictions, because OIM users have some required fields that must be filled in to create a user successfully.
    Hope that helps, and please let me know if you have more questions.
    Regards.

  • Questions about editing with io HD or Kona 3 cards

    My production company is switching from Avid to Final Cut Pro. I have a few editing system questions (not about ingesting and outputting, just questions about systems for the actual editors - we will have Mac Pros with either Kona 3 or Io HD for ingest and output).
    1) Our editors work from home, so they most likely will be using MacBook Pros - Intel Core 2 Duo 2.6GHz, 4GB machines with eSATA drives - to work on uncompressed HD. Would they be able to work more quickly in FCP on the new Mac Pro 8-core (2 quad-core 2.8GHz Intel Xeon), or will the MacBook Pros be able to hold their own editing hour-long documentaries in uncompressed HD?
    2) Will having an AJA Kona 3 (if we get the editors Mac Pros) or Io HD (for the MacBook Pros) connected be a significant help to the editors and their process? Will it speed up their work? Will it allow them to edit sequences without having to render clips of different formats? Or would they be just as well off editing without the Io HD?
    I'm just trying to get a better understanding of the necessity of the AJA hardware in terms of helping the editors do what they have to do with projects that have been shot on many formats - DVCPro tapes, Aiptek cameras that create QTs, and P2 footage.
    Thanks

    1. with the IoHD, laptops become OK working with ProRes and simply eSata setups. Without the Io, they can't view externally on a video monitor (a must in my book). It will not speed up rendering a ton, nor will it save renders of mixed formats. The idea is to get all source footage to ProRes with the Io, and then the Io also lifts the CPU from having to convert ProRes to something you can monitor externally on a video monitor, and record back to any tape format you want... all in real time.
    2. Kona 3's on Towers would run circles around render times on a Laptop... no matter what the codec, but the Kona does not really speed renders up. That's a function of the CPU and just how fast is it. (lots of CPU's at faster speeds will speed up render times).
    I'd recommend you capture to ProRes with Io's or the Kona 3 and don't work in uncompressed HD. You gain nothing doing it quality wise at all. And you only use up a ton of disk space (6 times the size in fact) capturing and working in uncompressed HD, which from your post, you're not shooting anyway. The lovely thing about ProRes is that it's visually lossless, efficient, and speeds up the editing process. Mixing formats can be done, but it's better to go to ProRes for all source footage, and edit that way.
    With either the Kona or the Io, you then can output to uncompressed HD tape... that's what they do for you no matter what codec you've edited in. ProRes is designed to be the codec of choice for all HD projects when you're shooting different formats especially... Get them all singing the same tune in your editing stations, and you'll be a much happier camper. Only reason to buy laptops is portability... otherwise you're much better off with towers and the Kona 3 speed wise.
    Jerry
    Message was edited by: Jerry Hofmann

  • Question about database structure - best practice

    I want to create, display, and maintain a data table of loan rates. These rates will be for two loan categories - Conforming and Jumbo. They will be for two loan terms - 15 year and 30 year. Within each term, there will be a display of -
    points (0, 1, 3) - rate - APR
    For example -
    CONFORMING
    30 year
    POINTS   RATE    APR
    0        6.375   6.6
    1        6.125   6.24
    3        6.0     6.12
    My first question is -
    Would it be better to set up the database with 5 fields (category, term, points, rate, apr), or 13 fields (category, 30_zeropointRate, 30_onepointRate, 30_threepointRate, 30_zeropointAPR, 30_onepointAPR, 30_threepointAPR, 15_zeropointRate, 15_onepointRate, 15_threepointRate, 15_zeropointAPR, 15_onepointAPR, 15_threepointAPR)?
    The latter option would mean that my table would only contain two records - one for each of the two categories. It seems simpler to manage in that regard.
    Any thoughts, suggestions, recommendations?
    Murray --- ICQ 71997575
    Adobe Community Expert
    (If you *MUST* email me, don't LAUGH when you do so!)
    ==================
    http://www.projectseven.com/go - DW FAQs, Tutorials & Resources
    http://www.dwfaq.com - DW FAQs, Tutorials & Resources
    ==================
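    A hedged sketch of what the 5-field (normalized) option could look like, loaded with the Conforming/30-year sample rows above (names and types are illustrative):

    CREATE TABLE loan_rates (
      category VARCHAR(20)  NOT NULL,   -- 'Conforming' or 'Jumbo'
      term     INT          NOT NULL,   -- 15 or 30 (years)
      points   DECIMAL(3,1) NOT NULL,   -- 0, 1 or 3
      rate     DECIMAL(6,3) NOT NULL,
      apr      DECIMAL(6,3) NOT NULL,
      PRIMARY KEY (category, term, points)
    );

    INSERT INTO loan_rates VALUES ('Conforming', 30, 0, 6.375, 6.6);
    INSERT INTO loan_rates VALUES ('Conforming', 30, 1, 6.125, 6.24);
    INSERT INTO loan_rates VALUES ('Conforming', 30, 3, 6.0,   6.12);

    -- The display above is then just a filtered query:
    SELECT points, rate, apr
      FROM loan_rates
     WHERE category = 'Conforming' AND term = 30
     ORDER BY points;

    New categories or terms become new rows rather than new columns, which is what the normalization advice in the replies below is getting at.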

    Thanks, Pat. I'm pretty sure that this is a dead-end expansion. The site itself will surely expand, but I think this particular need will be informational only....
    Murray --- ICQ 71997575
    Adobe Community Expert
    (If you *MUST* email me, don't LAUGH when you do so!)
    ==================
    http://www.projectseven.com/go - DW FAQs, Tutorials & Resources
    http://www.dwfaq.com - DW FAQs, Tutorials & Resources
    ==================
    "Pat Shaw" <[email protected]> wrote in message news:[email protected]...
    > But if the site ever wants to expand on its functionality etc. it can be very difficult to get round a de-normalised database. You can find that you have tied yourself in knots and the only solution is to go back and redesign the database, which often includes major redesigning of the front-end too.
    >
    > If you are confident that this will not be the case then go with your initial thoughts, but don't be too lenient just in case. Leave yourself a little scope. I always aim for 3rd normal form as this guarantees a robust database design without being OTT.
    >
    > Pat.
    >
    > "Joris van Lier" <[email protected]> wrote in message news:[email protected]...
    >>
    >> "Murray *ACE*" <[email protected]> wrote in message news:[email protected]...
    >>> I want to create, display, and maintain a data table of loan rates. These rates will be for two loan categories - Conforming and Jumbo. They will be for two loan terms - 15 year and 30 year. Within each term, there will be a display of -
    >>>
    >>> points (0, 1, 3) - rate - APR
    >>>
    >>> For example -
    >>>
    >>> CONFORMING
    >>> 30 year
    >>> POINTS   RATE    APR
    >>> ------   -----   ----
    >>> 0        6.375   6.6
    >>> 1        6.125   6.24
    >>> 3        6.0     6.12
    >>>
    >>> My first question is -
    >>>
    >>> Would it be better to set up the database with 5 fields (category, term, points, rate, apr), or 13 fields (category, 30_zeropointRate, 30_onepointRate, 30_threepointRate, 30_zeropointAPR, 30_onepointAPR, 30_threepointAPR, 15_zeropointRate, 15_onepointRate, 15_threepointRate, 15_zeropointAPR, 15_onepointAPR, 15_threepointAPR)?
    >>>
    >>> The latter option would mean that my table would only contain two records - one for each of the two categories. It seems simpler to manage in that regard.
    >>>
    >>> Any thoughts, suggestions, recommendations?
    >>
    >> In my opinion, normalizing is not necessary with small sites; for example, the uber-normalized database design I did for the telcost compare matrix (http://www.artronics.nl/telcostmatrix/matrix.php) proved to be totally overkill.
    >>
    >> Joris
    >

  • Question about Database Access

    Hello,
    I have one question regarding database access from Java.
    What are the ways of connecting to a database other than JDBC? This was one of the questions I was asked by an interviewer. He says a JDBC connection can slow down performance when it comes to a large number of transactions.
    Any insight or help will be very useful.
    Thanks

    He may have been referring to available "persistence managers" like Hibernate, or the mechanisms built into EJB containers. However, as far as I know, they all use JDBC underneath, and any speed-up is due to caching of data as objects.

  • Questions about rebuilding with end goal of mortgage in 1.5 -2 years

    Hello all. I have been reading the forum diligently for the last few months, but I'm in a conundrum as to what to do next in my rebuilding process and was hoping for advice. Here's a snapshot of my current situation. In 2008-2009 I had a sudden layoff and a long-term relationship end. I was left holding what seemed like insurmountable debt. I paid the collections (didn't know about PFD) but was still left with a credit report that had a bunch of lates, a collections account and a paid judgement. All accounts closed on me with the exception of a JCP that I had forgotten about, but American Express sent me a preapproval for the Optima Card with a 700 CL. I took it gratefully, and went off to lick my wounds. After years of struggling in a few low-paying jobs I've made it back to my former status salary-wise, and I'm looking to get myself into a small home in the next 1.5 to 2 years.
    What's on my credit report?
    - JCP (2008) - 4800 CL; again thanks to this forum, hit the luv button and got a huge CLI
    - Amex Optima (2009) - 2100 CL (thanks to the 3x Amex thread I got my first CLI in years on this card)
    - Credit Union Visa (3/2013) - 2500; this card hasn't budged in two years, but is my lowest interest rate
    - Care Credit (10/2014) - 1500 CL; wish I had known about the recon when I apped for this card. It's too low to be useful to me.
    - Discover (4/2015) - 6000 CL
    - AAoA is 7.9 years. Credit history is 18+ years. Utilization is 9%.
    - 1 paid CO on Experian due to fall off August 2015
    - 1 paid judgement on all 3 CRAs due to fall off in April of 2016
    - 1 inquiry on TU and 1 inquiry on Equifax. None on Experian.
    - No lates.
    - CK FAKOs 750, Discover TU FICO 712, Experian 665
    My question is... what next? I would love to app for the American Express Blue Cash Everyday card and the Chase Freedom in October, but it occurs to me that I have no loan info on my report, as my old car loans and home loans were from nearly 16 years ago. I want to avoid a car payment until I buy a house; however, does it make more sense to get a loan from my CU to flesh out my report, or should I go ahead and app for those two credit cards after that collection falls off and my Discover ages a bit? Thanks in advance for any insights.

    PDXgigi wrote:
    Thanks Dagsky. Any ideas on how long I should take to pay the loan back for the best score? I'm wondering if I should have it paid off before I look for a mortgage. Also, what do you consider to be small? I was thinking of taking a year to pay back 1000 to 1500. I have some concerns about the score dropping when the loan is paid off as well. Thanks again!

    Small as in no more than $500. Terms as long as you can get. If your CU won't go more than 12 months, go somewhere else. SDFCU, for example, will go 36 months on a $500 secured loan at 3.99%, with monthly payments of $14 and change.

  • Newbie questions about Database replication, "backups", and sql version

    Just created my first Sql Database using the "SQL Database Management Portal". A blast but three questions come to mind:
    1. Is this DB automatically "replicated"? (In other words, is DB "failover" or clustering a feature of all newly created Azure databases?)
    2. Our on-premise model is to make a daily DB backup which is saved to a nightly tape. If needed, we can restore a two-month-old database backup under a new name to compare it with the current one. Does SQL Azure support this capability or not? Can it be requested?
    3. Which on-premise version of SQL Server is "SQL Azure" closest to? (2014?)
    TIA,
    edm2
    P.S. My database was created using the "web" edition.

    Hi edm,
    According to your description, you have created a SQL Azure database on the Azure platform. The replication feature is not supported by Microsoft Azure SQL Database. If you want to sync the SQL Azure database with a local database, you can use the SQL Azure Data Sync service.
    For more information, see:
    http://blogs.technet.com/b/the_cloud_pilot/archive/2011/10/24/your-first-sql-azure-data-sync-step-by-step.aspx
    In addition, if you have Web or Business Edition databases, you must create your own backup strategy. You can use database copy or the Import and Export services to create copies of the data and export the file to a Microsoft Azure storage account. Meanwhile, Windows Azure SQL Database provides a mechanism for automating the process of exporting a database to BACPAC files on a set interval and frequency. For more information, see:
    Schedule an Automated Export:
    http://msdn.microsoft.com/en-us/library/hh335292.aspx#automate
    Windows Azure SQL Database Backup and Restore strategy:
    http://www.mssqltips.com/sqlservertip/3057/windows-azure-sql-database-backup-and-restore-strategy/
    Currently, Azure uses a special version of Microsoft SQL Server as its backend. It provides high availability by storing multiple copies of databases, elastic scale and rapid provisioning. When we check the version of SQL Server, it shows as follows:
    Microsoft SQL Azure (RTM) - 11.0.9216.62
    Regards,
    Sofiya Li
    TechNet Community Support
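    A hedged sketch of the database-copy option mentioned above (T-SQL run against the master database of the target server; server and database names are placeholders):

    -- Creates a transactionally consistent copy under a new name, which can be
    -- kept around, e.g. to compare against a two-month-old state of the data.
    CREATE DATABASE MyDb_copy_20140601 AS COPY OF myserver.MyDb;

    -- The copy runs asynchronously; check progress from master:
    SELECT name, state_desc FROM sys.databases WHERE name = 'MyDb_copy_20140601';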

  • Questions about Database roles

    Hi, 
    Need to set up a new (Windows) user which has varied access to tables, views and procedures in about 8 different schemas, all in the same database.
    I've created an instance-level login and then gone to the database to set the more granular details.
    First question: does this user need to own the schemas they will access?
    Secondly, I'm assuming the best bet would be to create a database role and then apply the privileges against that?
    I need to give access to all stored procedures (and future procedures) in a couple of these schemas and none in the others. Is it possible to grant execute on all procedures in a schema whilst prohibiting others?
    Then I would assign the database role to the new instance login?
    Thanks

    In the database he has access to a number of schemas, but I wish to explicitly exclude him from all views in those schemas and, in addition to this, all the sys catalogs, e.g.
    I also have no idea how to restrict access to the user-created views without doing them all manually, but then what happens in the future if new views are added? They are not going to be explicitly denied.
    David nailed the problem.
    There is no way to differentiate SELECT permission between tables and views. If they are all in the same schema and tables are allowed but views are not, you are out of luck and have to include the allowed objects one by one (as opposed to denying the not-allowed ones, with the risk of missing some in the future).
    This is why database design should have security in mind from the very beginning; views, being a way to access (aka "access schema") data in tables, should ideally be placed in separate schemas.
    Andreas Wolter (Blog |
    Twitter)
    MCM - Microsoft Certified Master SQL Server 2008
    MCSM - Microsoft Certified Solutions Master Data Platform, SQL Server 2012
    www.andreas-wolter.com |
    www.SarpedonQualityLab.com
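    On the original question about executing all current and future procedures in some schemas but not others, a hedged T-SQL sketch using a database role (schema, role and login names are placeholders):

    -- The user does not need to own the schemas; membership in a role that holds
    -- schema-level permissions is enough.
    CREATE ROLE app_exec_role;

    -- EXECUTE at schema scope covers existing and future procedures in that schema.
    GRANT EXECUTE ON SCHEMA::sales   TO app_exec_role;
    GRANT EXECUTE ON SCHEMA::billing TO app_exec_role;

    -- Simply grant nothing on the other schemas, or deny explicitly if required:
    -- DENY EXECUTE ON SCHEMA::hr TO app_exec_role;

    -- Add the database user mapped to the Windows login to the role
    -- (use sp_addrolemember on versions before SQL Server 2012).
    ALTER ROLE app_exec_role ADD MEMBER [DOMAIN\SomeUser];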

  • Question about Databases on a distributed environment...

    Hi,
    I have quick question. We have production in a distributed environment as follows
    (a) SQL server, EPMA, and Calc Manager
    (b) Workspace, and Shared Services
    (c) Essbase
    (d) Planning
    Now we have multiple databases for each hyperion service i.e.
    (1) HSS (using for hyperion shared service)
    (2) BIPLUS (Using for workspace)
    (3) AAS (Using for Essbase Administrator Services)
    (4) CALCMGR (Using for calculation manager)
    (5) EPMA (Using for EPMA)
    (6) ERPI (Using for ERP Integrator)
    (7) PLANSYS (Using for Planning)
    Now, my question is: is it necessary to have that many databases in a distributed environment, or can we have one database for everything? What is the main objective of creating a separate database for each application?
    Please share your knowledge and give me the positives and negatives of multiple databases versus a single database for Hyperion...
    Thanks to all...
    Safi

    I am going to be lazy and copy an extract from the install doc
    For simplicity and ease of deployment, you can use one database repository for all products (with the exceptions noted below). When you configure multiple products at one time using EPM System Configurator, one database is configured for all selected products.
    Caution!
    To use a different database for each product, perform the “Configure Database” task separately for each product. In some cases you might want to configure separate databases for products. Consider performance, rollback procedures for a single application or product, and disaster recovery plans.
    The following products and product components require unique databases:
    Performance Management Architect interface data source
    Extended Analytics for Financial Management and Extended Analytics for Strategic Finance
    Planning – Each Planning application should have its own repository.
    Performance Scorecard
    FDM – Use an Oracle Database instance exclusively for FDM.
    Data Relationship Management.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Some important question about JPA!!!

    Hi there, I have a web application written in Java 1.4.
    I want to move it to Java 1.5 using JPA, but after reviewing the old code I've got stuck on a few points:
    1. This application has a function that gets lists from a table:
    Collection getList(String className, String tableName)
    It does the following:
    - Gets the set of properties from the class schema (by className)
    - Gets a record set from the table
    - For each row of the record set:
      - Creates an object and sets each property value from the column named the same as the property
      - Adds the object to the collection
    - Returns the collection
    To my knowledge, in JPA each entity class is mapped to one table schema, so this function can't be migrated as-is. But in my application there are about 50 tables, and mapping an entity to each table is not a good idea. Is there another solution?
    2. In the database design, a table can have a composite PK merged from many parts, each part being a property of the entity too. For example, table User has column User_ID = Agency_ID + Department_ID + SeqNum. How can I map the Agency and Department properties of the User entity to the table User?
    Please help me, thanks.
    Thank you.

    Hi uvnet,
    I think the old code can run in Java 1.5 with a minor change, but the problem is that I want to take advantage of 1.5, especially JPA (a big change :D).
    Anyway, the sample code would be:
    import java.sql.ResultSet;
    import java.util.ArrayList;
    import java.util.Collection;
    import java.util.Iterator;
    import java.util.Map;
    import org.apache.commons.beanutils.BeanUtils;
    import org.apache.commons.beanutils.PropertyUtils;

    public class DAO {

         public Collection getList(String className, String tableName, int encoding) {
              ArrayList<Object> returnValue = new ArrayList<Object>();
              try {
                   // Describe the bean class once to learn which properties it exposes.
                   Object objDescriber = Class.forName(className).newInstance();
                   Map map = BeanUtils.describe(objDescriber);

                   // Get the ResultSet for the table (existing DAO helper).
                   ResultSet rs = getResultSet(tableName);
                   while (rs.next()) {
                        // One new bean per row.
                        Object obj = Class.forName(className).newInstance();
                        Iterator properties = map.keySet().iterator();
                        while (properties.hasNext()) {
                             String name = (String) properties.next();
                             try {
                                  // Read the column with the same name as the property
                                  // (existing DAO helper) and copy it onto the bean.
                                  Object value = getValue(rs, name, encoding);
                                  PropertyUtils.setProperty(obj, name, value);
                             } catch (Exception ex) {
                                  // Skip properties that have no matching column.
                             }
                        }
                        returnValue.add(obj);
                   }
              } catch (Exception ex) {
                   ex.printStackTrace();
              }
              return returnValue;
         }
    }

  • Question about VPN with Server 2012

    I am a programmer, but I know enough about network and server configs to be slightly dangerous when it comes to configurations.
    My task today was to try to setup VPN access at our office. We have a 2012R2 standard server that was not doing anything (not even on), so I decided it would be the perfect machine for the job.
    To make this as short as possible, we are a real small office, 3 guys. We have a high-end (although consumer-grade) router that is not a VPN endpoint but does support VPN tunneling through its firewall. The router is the DHCP server; however, most workstations and servers in the office are set up with static addresses.
    I got everything to work (pats self on back) to allow me to create a new VPN connection from an outside Windows client machine, and everything worked as I wanted it to, except for machine name resolution. I understand this is due to limitations in NetBIOS and how that all works. I found an article about setting up a WINS server, and once I did that, it resolved my issue and I was able to connect to shares or RDP to machines by name instead of by local IP address.
    So I came here to ask this:
    1) The WINS server method required me to manually add each machine/IP I wanted to register. I am not sure why this is, but I am thinking it has to do with the server not acting as the DHCP/DNS server (or maybe this is just how WINS works?). This isn't a HUGE deal, but it would be nice if I didn't have to manually manage this list. If Windows clients on the local network here can broadcast and see all available network machines, why can't the WINS server make the same network crawl automatically?
    2) In reading about the WINS server, most people said it was a relic and one should use DNS instead. I am looking for more information on this, to see if this is something I should consider or worry about.
    3) Would I make my life easier if I made the 2012 server both the DHCP and DNS server and removed those tasks from the router?
    A side note if it matters: we are NOT on a domain here, just a loose set of workgroup machines. We find that setting up a whole DC and adding all these machines and accounts to a domain is counterproductive for us, because we are such a small shop here, with nothing to restrict from each other, and we find that adding a domain into the picture complicates things more than it simplifies them. Of course, unless someone can make a case why we should just move to being on a DC.
    Matt Kleinwaks - MSMVP MSDN Forums Moderator - www.zerosandtheone.com

    Hi,
    There is a Browser Election procedure when you use the WINS service, but the WINS service behaves differently with respect to Browser Elections in a domain environment and in a workgroup environment.
    By default, all domain controllers in a domain are browser servers.
    You can refer the following KB to understand the procedure:
    Browser Elections
    http://technet.microsoft.com/en-us/library/cc959896.aspx
    How Computer Browser Service Works
    http://technet.microsoft.com/en-us/library/cc737661(v=ws.10).aspx
    Description of the Microsoft Computer Browser Service
    http://support.microsoft.com/kb/188001
    If you install a WINS server, the WINS server will handle name registration requests from WINS clients, register their names and IP addresses, and respond to NetBIOS name queries submitted by clients, returning the IP address of a queried name if it is listed in the server database.
    WINS provides the following benefits for administering your TCP/IP-based network:
    A dynamic name-to-address database that maintains support for computer name registration and resolution.
    Centralized management of the name-to-address database, alleviating the need for managing Lmhosts files.
    Reduction of NetBIOS-based broadcast traffic on subnets, by permitting clients to query WINS servers to directly locate remote systems.
    Support for earlier Microsoft® Windows® and NetBIOS-based clients on your network, permitting these types of clients to browse lists for remote Windows domains without requiring
    a local domain controller to be present on each subnet.
    Support for DNS-based clients by enabling those clients to locate NetBIOS resources when WINS lookup integration is implemented.
    The related KB:
    WINS servers
    http://technet.microsoft.com/en-US/library/cc781189
    Windows Internet Name Service (WINS) Overview
    http://technet.microsoft.com/en-US/us-en/library/cc725802.aspx
    Hope this helps.
