BC4J Best Practices

Hi,
I am looking for a best-practices document, from Oracle or anyone else, on implementing features in BC4J objects. An example would be: "Where should a method/feature go?"
I am facing a choice between writing the insert, update and delete methods in the view objects and putting them in the application modules, where they would obtain a handle to the view object. Both are equally possible and correct, but which one is the best practice?
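For illustration, the application-module variant I have in mind would look roughly like this (a minimal sketch; the class, VO instance and attribute names are hypothetical, but findViewObject, findByKey and getDBTransaction are standard oracle.jbo APIs):

    import oracle.jbo.Key;
    import oracle.jbo.Row;
    import oracle.jbo.ViewObject;
    import oracle.jbo.domain.Number;
    import oracle.jbo.server.ApplicationModuleImpl;

    public class DeptServiceImpl extends ApplicationModuleImpl {
        // The AM obtains a handle to the view object and does the work there.
        public void deleteDepartment(Number deptId) {
            ViewObject vo = findViewObject("DeptView"); // hypothetical VO instance name
            Row[] rows = vo.findByKey(new Key(new Object[] { deptId }), 1);
            if (rows.length > 0) {
                rows[0].remove();
                getDBTransaction().commit();
            }
        }
    }

The alternative would be the same method on the view object itself (a DeptViewImpl subclass) instead.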
Please add your comments and suggestions to this thread.
Patrick.

Hi,
We use only the Stateful release mode for application modules, defined in the action mappings in struts-config.xml exactly the same way as in your example. Stateful mode releases the module instance back to the pool, where it can be reused by other sessions as well. However, all the code that uses the app modules, view objects, etc., must be written with the assumption that the module or view object the code is operating on can be a different instance from the one in the previous request of the same session.
The concept of BC4J is that this recycling of modules should be transparent to the users of the app modules, but this is not exactly the case. Some things are not passivated in the AM's snapshots and are not activated in case of recycling, for example custom view object properties or entries in the userData map (or at least they were not in 9.0.5, and I doubt this has changed in 10.1.2). These are things that you have to passivate and activate manually if you use them to store information that is relevant to a particular user session.
In all likelihood, the strange things you are experiencing occur only in sessions that use recycled application modules, that is, where there was passivation and subsequent activation of VO and AM state. I have found it useful, as a minimum, to test the application with only 1 application module in the pool and at least 2 user sessions, constantly recycling this one AM instance. Many of the problems that would surface in real application usage only under high load can be reproduced in this artificial setup.
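To passivate such custom state manually, the usual hooks are the passivateState/activateState overrides on oracle.jbo.server.ApplicationModuleImpl. A minimal sketch, assuming a single session-relevant string kept in a transient field (the field and element names are hypothetical):

    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.Node;
    import oracle.jbo.server.ApplicationModuleImpl;

    public class MyServiceImpl extends ApplicationModuleImpl {
        private String userRegion; // not part of the framework snapshot by itself

        protected void passivateState(Document doc, Element parent) {
            super.passivateState(doc, parent);
            if (userRegion != null) {
                // Write the custom value into the AM's passivation snapshot.
                Element e = doc.createElement("userRegion");
                e.appendChild(doc.createTextNode(userRegion));
                parent.appendChild(e);
            }
        }

        protected void activateState(Element elem) {
            super.activateState(elem);
            userRegion = null;
            if (elem != null) {
                // Read the value back when the AM is activated after recycling.
                Node n = elem.getElementsByTagName("userRegion").item(0);
                if (n != null && n.getFirstChild() != null) {
                    userRegion = n.getFirstChild().getNodeValue();
                }
            }
        }
    }

For the stress test, I believe running with -Djbo.ampool.doampooling=false forces passivation and activation on every request, which surfaces the same class of bugs even with a single session.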

Similar Messages

  • BC4J ApplicationModule - best practice in Stateful web application?

    When writing a stateful web application that uses the BC4J framework as the model, what is considered best practice in using the Application Module? Is it okay to store an AM instance in an HttpSession, or should we opt for the features explained in the BC4J Pooling samples, which use the SessionCookie interface?
    Tips/Tricks/Pitfalls information welcome
    Thx,

    Best practice is to store the SessionCookie (an ApplicationModule handle) as demonstrated in the pooling sample.
    This allows many advantages, including scalable state management support, timeout support, and
    failover/clustering support.
    Caching the ApplicationModule directly can be dangerous because:
    1. The AM is not serializable, which could result in serialization exceptions if the servlet container were distributable.
    2. The AM does not respond to timeout, which could result in memory leaks if the AM is not explicitly returned to the
    pool at the end of each request.
    3. For stateful applications the memory consumed by each AM could be significant. Even if the AM were correctly
    released to the pool upon session timeout, it would still have consumed that memory up to that point. Using the
    SessionCookie along with state-managed release allows for scalable state management.
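    A minimal sketch of the cookie-handle pattern, loosely following the 9.0.x pooling sample (the classes are from oracle.jbo.common.ampool; treat the exact pool-lookup and release signatures as assumptions, and all names as hypothetical):

        import javax.servlet.http.HttpSession;
        import oracle.jbo.ApplicationModule;
        import oracle.jbo.common.ampool.ApplicationPool;
        import oracle.jbo.common.ampool.PoolMgr;
        import oracle.jbo.common.ampool.SessionCookie;

        public class AmCookieHelper {
            public static void doRequestWork(HttpSession session) {
                // Store the lightweight, serializable SessionCookie in the
                // HttpSession -- never the ApplicationModule itself.
                SessionCookie cookie = (SessionCookie) session.getAttribute("amCookie");
                if (cookie == null) {
                    ApplicationPool pool = PoolMgr.getInstance().findPool(
                        "DeptAMPool", "model.DeptAM", "DeptAMLocal", null);
                    cookie = pool.createSessionCookie("myApp", session.getId(), null);
                    session.setAttribute("amCookie", cookie);
                }
                ApplicationModule am = cookie.useApplicationModule(); // check out of pool
                try {
                    // ... perform this request's work against am ...
                } finally {
                    cookie.releaseApplicationModule(true, true); // state-managed release
                }
            }
        }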

  • Best practice recommendation--BC set

    Dear friends,
    I am using the BC set concept to capture my configurations. I am a PP consultant.
    Let us consider one scenario: configuring plant parameters in transaction OPPQ.
    My requirement is:
    A.   Define floats (schedule margin key):
    SM key: 001
    Opening period: 1 day
    Float before production: 2 days
    Float after production: 1 day
    Release period: 1 day
    B.   Number range
    Maintain the internal number range as 10, from 010000000 to 999999999 (for planned orders).
    This is my configuration requirement.
    Method M1:
    Name of the BC set: ZBC_MRP1
    While creating the BC set for the first time, while defining the floats, I wrongly captured/activated the opening period as 100 instead of 001. But I correctly captured the value for the number range (for my planned orders).
    Now if you look at the activation log for my BC set, the BC set shows a "GREEN" light: Version 1, successfully activated, but the activated values are wrong.
    So I want to change my BC set values and reactivate the BC set with the correct value. I activate the same BC set again with the correct value of the opening period (001). After reactivating the BC set, the activation log shows one more version (Version 2) with a "GREEN" light.
    So two versions are visible in my activation log.
    If I activate Version 1, the wrong values will be updated in the configuration.
    If I activate Version 2, the correct values will be activated in the configuration.
    Both versions can be activated at any point in time; the latest activated version is always on top.
    So method 1 (M1) is: with one BC set name, maintain different versions of the BC set, and activate the version your requirement calls for.
    Method 2 (M2):
    Instead of creating versions within the same BC set, create one more BC set to capture the new values.
    Then, if I activate the second BC set, the configuration will be updated.
    Please suggest which method is best practice (M1 or M2)?
    Thanks
    Senthil


  • Best practices Struts for tech. proj. leads

    baseBeans engineering won best training from the readers of JDJ and published the first book on Struts, called FastTrack to Struts.
    The upcoming class is live in NYC on 5/2, from 7:30 AM to 1:00 PM. We will cover DB-driven web site development, process, validation, tiles, multi-row, J2EE security, DAO, development process, SQL tuning, etc.
    We will teach a project tech lead methods that will increase the productivity of his team, and review best practices so that they can benchmark their environment.
    Sign up now for $150; the price will be $450 as we get closer to the date (the price goes up every few days). The web site to sign up on is baseBeans.net.
    You will receive a lab/content CD when you sign up.
    Contact us for more details.
    ·     We preach and teach simple.
    ·     We use a very fast DAO DB layer - with a DAO-side data cache.
    ·     We use JSTL.
    ·     We use a list-backed bean with a DAO helper design, for access to any native source and for switching out the DAO.
    ·     We use J2EE security: container-managed declarative authorization and authentication (no code, works on any app server).
    ·     Struts-based Content Management System. A Struts menu entry like this:
    <Item name="About_Contacts"      title="About/Contacts"
    toolTip="About Us and Contact Info" page="/do/cmsPg?content=ABOUT" />
    passes the "ABOUT" parameter to the action, which the DAO uses to populate the content.
    You can peek at the source code at sourceforge.net/projects/basicportal or go to our site baseBeans.net (16,000 downloads since Oct. 2002).
    Note that baseBeans.net is using the Content Management System (SQL based) that we train on (our own dog food).
    Note: We always offer money back on our public classes.
    Vic Cekvenich
    Project Recovery Specialist
    [email protected]
    800-314-3295
    <a href="baseBeans.net">Struts Training</a>
    ps:
    to keep up on training, details, best practices, etc., sign up for this mailing list:
    http://www.basebeans.net:8080/mailman/listinfo/mvc-programmers
    (1,000 + members)


  • Best practices question

    I've been starting to work with JDev 9.0.3 and web services, and I can see clearly how to:
    -- take my existing database information and create a web service for it. I've done this with PL/SQL, and will start trying to do it with BC4J shortly.
    -- take an existing WSDL and generate a stub to call it, and then use Java to make sense out of the structured information that comes back.
    But it looks like my real task will be to:
    -- import an existing XML schema (created elsewhere)
    -- map that to my relational database structure
    -- build a webService that will deliver that schema, based on information in the database
    (we'll also be going the other direction and doing inserts and updates).
    We're expecting to do a lot of these. I want to make it as simple as possible for those who will follow me. And I'd like to minimize the work involved when the schema changes, which it certainly will. I can see parts of the picture, but I'm sure I don't see the whole thing.
    So does anybody have any suggestions?
    -- jim

    I am not sure I can claim best practices, but I can point you at some solutions that may help shape your thinking:
    -- import an existing XML schema (created elsewhere)
    Here you have two choices for processing from the mid-tier:
    1. Deserialize the incoming XML document into some sort of Java object so you can deal with it programmatically - see the above message on MS SOAP/Oracle SOAP interoperability for an example of this that ships with OC4J - then do JDBC inserts into the DB based on those Java objects.
    2. Work with it as an XML document - JDev 9.0.3 can produce clients for document-based Web services. Here you can use either Oracle9i XML DB or the XML SQL Utility to map to the DB.
    I am not sure one approach is better than another. How you deal with your second point will probably drive the choice.
    -- map that to my relational database structure
    With Oracle as the receiving point, you have a number of choices:
    1. Oracle9i DB R2. This will let you register an XML Schema in the database, and using extra annotations in the schema, the database itself will actually map incoming documents defined by those schemas directly to tables. This would imply, per point 1 of my answer, keeping the incoming document in DOM format. See the new XML Center, 2nd headline, for all info on XML DB, particularly the rather extensive demo that just came out with that launch.
    2. XML SQL Utility. This utility lets you programmatically take an XML document and, if it is in a canonical format (<ROWSET><ROW><COLNAME1>xxx</COLNAME1><COLNAME2>xxx</COLNAME2>...</ROW></ROWSET>), lets you do inserts, updates and deletes. The trick here would be doing a transformation from the incoming schema to the appropriate canonical format. This too suggests keeping the incoming Web service payload in XML format (see the sketch at the end of this reply).
    3. Deserialize into Java objects and then programmatically do the JDBC processing to insert, update, delete etc.
    -- build a webService that will deliver that schema, based on information in the database
    Here XSU might be quite nice. It has the ability to deliver the schema as part of the body of the XML document extracted from the database ... any select statement with this extra parameter will include the schema (or DTD) in the header of the document. See:
    Intro:
    http://otn.oracle.com/docs/products/oracle9i/doc_library/release2/appdev.920/a96621/adx01bas.htm#1002519
    and
    More detailed, including mapping issues:
    http://otn.oracle.com/docs/products/oracle9i/doc_library/release2/appdev.920/a96621/adx08xsu.htm
    and for the schema generation, see the extra parameter to the XMLQuery method:
    http://otn.oracle.com/docs/products/oracle9i/doc_library/release2/appdev.920/a96616/arxml08.htm#1004604
    The dept/emp service here on OTN does exactly that using the XSU (sans the schema information). See:
    http://otn.oracle.com/tech/webservices/htdocs/series/deptemp/content.html
    As for which is better, I think it depends on where you are coming from. If you want to take advantage of your investment in the Oracle DB, the XML DB is probably the more powerful approach. If you like doing all the work in the middle tier, the XSU or deserialization approach mentioned above might fit better.
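    For what it's worth, a rough sketch of the XSU approach in Java (the classes are from the 9i XDK's oracle.xml.sql packages; treat the exact signatures, connect string and table name as assumptions):

        import java.sql.Connection;
        import java.sql.DriverManager;
        import oracle.xml.sql.dml.OracleXMLSave;
        import oracle.xml.sql.query.OracleXMLQuery;

        public class XsuSketch {
            public static void main(String[] args) throws Exception {
                Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@localhost:1521:ORCL", "scott", "tiger");

                // Insert: the document must already be in canonical
                // <ROWSET><ROW>...</ROW></ROWSET> form, so transform the
                // incoming schema to that shape (e.g. with XSLT) first.
                OracleXMLSave save = new OracleXMLSave(conn, "EMP");
                save.insertXML("<ROWSET><ROW><EMPNO>7999</EMPNO>"
                             + "<ENAME>SMITH</ENAME></ROW></ROWSET>");

                // Query: ask XSU to include the schema in the generated document.
                OracleXMLQuery qry = new OracleXMLQuery(conn, "select * from EMP");
                System.out.println(qry.getXMLString(OracleXMLQuery.SCHEMA));

                conn.close();
            }
        }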
    Mike.

  • Best practice recommendation for locale-specific text/labels

    What is the recommended best-practice approach to supporting locale-specific
    text for labels and messages when using JDeveloper to create applets and applications?
    I am familiar with resource bundles, but wonder if there is a better approach within
    JDeveloper. Are there any plans to enhance this area in 9.0.3?

    I am familiar with resource bundles, but wonder if there is a better approach within JDeveloper. -- Resource bundles are the Java-native way of handling locale-specific texts.
    Are there any plans to enhance this area in 9.0.3? -- For BC4J, in 9.0.3, all control hints and custom validation messages (a new feature) are generated in resource bundles rather than XML files, to make it easier to "extend" for multiple locales.
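    For reference, the plain-Java resource-bundle lookup looks like this (the bundle and key names are hypothetical):

        import java.util.Locale;
        import java.util.ResourceBundle;

        public class LabelDemo {
            public static void main(String[] args) {
                // Loads Messages_de.properties for German, falling back to
                // Messages.properties when no locale-specific file exists.
                ResourceBundle bundle =
                    ResourceBundle.getBundle("Messages", Locale.GERMAN);
                System.out.println(bundle.getString("label.save"));
            }
        }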

  • Best practices exposing AM (OAF 11.5.10) as webservice to external systems

    I have a customer who is developing extensions to their E-Business Suite install base using OAF 11.5.10, and they have approached me with questions on how they could expose some of the business services they have developed (mainly AMs and VOs) as web services to be used in a BPEL/web-service framework. The BPEL engine is SeeBeyond (not sure how it is spelled), not Oracle's.
    I have outlined 2 ways, but since I have not developed anything on OAF, I have no idea whether either is possible.
    First: migrate the ADF BC (or BC4J) projects from OAF to JDeveloper 10.1.3 and, more or less, create a simple facade layer of a session bean, then right-click and deploy it as a web service.
    Second: use a web-service library such as Axis directly in JServ to expose them on the "target" server.
    Does anyone have any best practices on this topic?
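    For the first option, the facade I have in mind would be something like this minimal sketch (Configuration.createRootApplicationModule and the ViewObject calls are standard ADF BC APIs; the AM, VO and attribute names are hypothetical):

        import oracle.jbo.ApplicationModule;
        import oracle.jbo.Row;
        import oracle.jbo.ViewObject;
        import oracle.jbo.client.Configuration;

        // Each call checks the root AM out of the pool, delegates to a VO,
        // and releases the AM again; deployed as a web service, the BC4J
        // objects themselves stay unexposed.
        public class OrderFacade {
            public String getOrderStatus(long orderId) {
                ApplicationModule am = Configuration.createRootApplicationModule(
                    "model.OrderAM", "OrderAMLocal");
                try {
                    ViewObject vo = am.findViewObject("OrderView");
                    vo.setWhereClause("ORDER_ID = " + orderId); // sketch only; bind variables preferred
                    vo.executeQuery();
                    Row row = vo.first();
                    return (row != null) ? (String) row.getAttribute("Status") : null;
                } finally {
                    Configuration.releaseRootApplicationModule(am, true);
                }
            }
        }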


  • Logical level in Fact tables - best practice

    Hi all,
    I am currently working on a complex OBIEE project/solution where I am going straight to the production tables, so the fact (and dimension) tables are pretty complex, since I am using multiple sources in the logical tables to increase performance. Anyway, what I often struggle with are the Logical Levels (on the Content tab), where the level of each dimension has to be set. In a star schema (one-to-many) this is pretty straightforward and easy to set up, but when the Business Model (and the physical model) gets more complex, I sometimes struggle with the aggregates - getting them to work/appear with different dimensions. (Using the menu "More" - "Get levels" does not always give the best solution... far from it.) I have some combinations of left and right outer joins as well, making it even more complicated for the BI server.
    For instance - I have about 10-12 different dimensions - should all of them always be connected to each fact table? Either on Detail or Total level. I can see the use of the logical levels when using aggregate fact tables (on quarter, month, etc.), but is it better just to skip the logical-level setup when no aggregate tables are used? Sometimes it seems like that is the easiest approach...
    Does anyone have a best practice concerning this issue? I have googled for this but I haven't found anything good yet. Any ideas/articles are highly appreciated.

    Hi User,
    For instance - I have about 10-12 different dimensions - should all of them always be connected to each fact table? Either on Detail or Total level.
    It is not necessary to connect all of them; it depends on the reports you are creating. As a best practice, though, you should set them all at the Detail level wherever you define join conditions in the physical layer.
    For example, for the sales table: if you want to report at the ProductDimension.ProductName level, you should use the Detail level; otherwise use the Total level (at the Product or Employee level).
    Get Levels (available only for fact tables) changes aggregation content. If joins do not exist between fact table sources and dimension table sources (for example, if the same physical table is in both sources), the aggregation content determined by the Administration Tool will not include the aggregation content of this dimension.
    Source: Admin Guide (Get Levels definition)
    thanks,
    Saichand.v

  • Best practices for setting up users on a small office network?

    Hello,
    I am setting up a small office and am wondering what the best practices/steps are to set up and manage the admin and user logins and the sharing privileges for the setup below:
    Users: 5 users on new iMacs (x3) and upgraded G4s (x2)
    Video editing suite: want to connect a new iMac and a Mac Pro, on an open login (multiple users)
    All machines should be able to connect to the network, the peripherals and the external hard drive. I would also like to set up drop boxes to easily share files between the computers (I was thinking of using the external hard drive for this).
    Thank you,


  • Add fields in transformations in BI 7 (best practice)?

    Hi Experts,
    I have a question regarding transformation of data in BI 7.0.
    Task:
    Add new fields in a second level DSO, based on some manipulation of first level DSO data. In 3.5 we would have used a start routine to manipulate and append the new fields to the structure.
    Possible solutions:
    1) Add the new fields to first level DSO as well (empty)
    - Pro: Simple, easy to understand
    - Con: Disc space consuming, performance degrading when writing to first level DSO
    2) Use routines in the field mapping
    - Pro: Simple
    - Con: Hard to performance optimize (we could of course fill an internal table in the start routine and then read from this to get some performance optimization, but the solution would be more complex).
    3) Update the fields in the End routine
    - Pro: Simple, easy to understand, can be performance optimized
    - Con: We need to ensure that the data we need also exists (i.e. if we have one field in DSO 1 that we only use to calculate a field in DSO 2, this would also have to be mapped to DSO 2 in order to exist in the routine).
    Does anybody know what the best practice is? Or do you have any experience regarding what you see as the best solution?
    Thank you in advance,
    Mikael

    Hi Mikael.
    I like the 3rd option and have used it many, many times. In answer to your question:
    Update the fields in the end routine
    - Pro: Simple, easy to understand, can be performance optimized - Yes, I have read and tested that this works faster. There is an OSS consulting note out there indicating the speed of the end routine.
    - Con: We need to ensure that the data we need also exists (i.e. if we have one field in DSO 1 that we only use to calculate a field in DSO 2, this would also have to be mapped to DSO 2 in order to exist in the routine). - Yes, but by using the result package, the manipulation can be done easily.
    Hope it helps.
    Thanks,
    Pom

  • Temp Tables - Best Practice

    Hello,
    I have a customer who uses temp tables all over their application.
    This customer is a novice and the app has its roots in VB6. We are converting it to .NET.
    I would really like to know the best practice for using temp tables.
    I have seen code like this in the app:
    CR2.Database.Tables.Item(1).Location = "tempdb.dbo.[##Scott_xwPaySheetDtlForN]"
    That seems to work, though I do not know why the full tempdb.dbo.[## prefix is required.
    However, when I use this in the new report I am doing, I get runtime errors.
    I also tried this:
    CR2.Database.Tables.Item(1).Location = "##Scott_xwPaySheetDtlForN"
    I did not get errors, but I was returned data I did not expect.
    Before I delve into different ways to do this, I could use some help with a good pattern to use.
    thanks

    Hi Scott,
    Are you using the RDC still? It's not clear, but it looks like it.
    We had an API that could piggyback the HDBC handle in the RDC (craxdrt.dll), but that API is no longer available in .NET. Also, the RDC is not supported in .NET, since .NET uses the framework and the RDC is COM.
    The workaround is to copy the temp data into a data set and then set the location to the data set. There is no way that I know of to get to tempdb from .NET. The reason is that there is no CR API to set the owner of the table to the user; MS SQL Server locks the temp table so that user has exclusive rights on it.
    Thank you
    Don

  • Best Practice for Significant Amounts of Data

    This is basically a best-practice/concept question and it spans both Xcelsius & Excel functions:
    I am working on a dashboard for the US Military to report on some basic financial transactions that happen on bases around the globe.  These transactions fall into four categories, so my aggregation is as follows:
    Year,Month,Country,Base,Category (data is Transaction Count and Total Amount)
    This is a rather high level of aggregation, and it takes about 20 million transactions and aggregates them into about 6000 rows of data for a two year period.
    I would like to allow the users to select a Category and a country and see a chart which summarizes transactions for that country ( X-axis for Month, Y-axis Transaction Count or Amount ).  I would like each series on this chart to represent a Base.
    My problem is that 6000 rows still appears to be too many rows for an Xcelsius dashboard to handle.  I have followed the Concatenated Key approach and used SUMIF to populate a matrix with the data for use in the Chart.  This matrix would have Bases for row headings (only those within the selected country) and the Column Headings would be Month.  The data would be COUNT. (I also need the same matrix with Dollar Amounts as the data). 
    In Excel this matrix works fine and seems to be very fast.  The problem is with Xcelsius.  I have imported the spreadsheet, but have NOT even created the chart yet and Xcelsius is CHOKING (and crashing).  I changed Max Rows to 7000 to accommodate the data.  I placed a simple combo box and a grid on the Canvas - BUT NO CHART yet - and the dashboard takes forever to generate and is REALLY slow to react to a simple change in the Combo Box.
    So, I guess this brings up a few questions:
    1)     Am I doing something wrong and did I miss something that would prevent this problem?
    2)     If this is standard Xcelsius behavior, what are the Best Practices to solve the problem?
    a.     Do I have to create 50 different Data Ranges in order to improve performance (i.e. Each Country-Category would have a separate range)?
    b.     Would it even work if it had that many data ranges in it?
    c.     Do you aggregate it as a crosstab (Months as Column headings) and insert that crosstabbed data into Excel.
    d.     Other ideas that I'm missing?
    FYI:  These dashboards will be exported to PDF and distributed.  They will not be connected to a server or data source.
    Any thoughts or guidance would be appreciated.
    Thanks,
    David

    Hi David,
    I would leave your query
    "Am I doing something wrong and did I miss something that would prevent this problem?"
    to the experts/ gurus out here on this forum.
    From my end, you can follow
    TOP 10 EXCEL TIPS FOR SUCCESS
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/204c3259-edb2-2b10-4a84-a754c9e1aea8
    Please follow the Xcelsius Best Practices at
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a084a11c-6564-2b10-79ac-cc1eb3f017ac
    In order to reduce the size of xlf and swf files follow
    http://myxcelsius.com/2009/03/18/reduce-the-size-of-your-xlf-and-swf-files/
    Hope this helps to a certain extent.
    Regards
    Nikhil

  • Best practice for Catalog Views? :|

    Hello community,
    A best practice question:
    The situation: I have several product categories (110), several items in those categories (4000) and 300 end users. I would like to know the best practice for segmenting the catalog. I mean, some users should only see categories 10, 20 & 30; other users only category 80, etc. The problem is: how can I implement this?
    My first idea is:
    1. Create 110 Procurement Catalogs (1 for every prod.category).   Each catalog should contain only its product category.
    2. Assign in my Org Model, in a user-level all the "catalogs" that the user should access.
    Do you have any ideas for improving this?
    Saludos desde Mexico,
    Diego

    Hi,
    Your way of doing it will work, but you'll get maintenance issues (too many catalogs, and a catalog link to maintain for each user).
    The other way is to build your views in CCM and assign these views to the users, either on the roles (PFCG) or on the user (SU01). The problem is that with CCM 1.0 this is limited, because you'll have to assign the items to each view one by one (no dynamic or mass processes); it has been enhanced in CCM 2.0.
    My advice:
    - Challenge your customer about views, and try to limit the number of views, for example to strategic and non-strategic.
    - With CCM 1.0, stick to the procurement catalogs, or implement BADIs to assign items to the views (I have experienced this; it works, but it is quite difficult), but with a limited number of views.
    Good luck.
    Vadim

  • Best practice on sqlite for games?

    Hi Everyone, I'm new to building games/apps, so I apologize if this question is redundant...
    I am developing a couple games for Android/iOS, and was initially using a regular (un-encrypted) sqlite database. I need to populate the database with a lot of info for the games, such as levels, store items, etc. Originally, I was creating the database with SQL Manager (Firefox) and then when I install a game on a device, it would copy that pre-populated database to the device. However, if someone was able to access that app's database, they could feasibly add unlimited coins to their account, unlock every level, etc.
    So I have a few questions:
    First, can someone access that data in an APK/IPA app once downloaded from the app store, or is the method I've been using above secure and good practice?
    Second, is the best solution to go with an encrypted database? I know Adobe Air has the built-in support for that, and I have the perfect article on how to create it (Ten tips for building better Adobe AIR applications | Adobe Developer Connection) but I would like the expert community opinion on this.
    Now, if the answer is to go with encryption, that's great - but, in doing so, is it still possible to use the copy function at the beginning, or do I need to include all of the script to create the database tables and then populate them with everything? That will be quite a bit of script to handle the initial setup, and if the user were to abandon the app halfway through that population, it might mess things up.
    Any thoughts / best practice / recommendations are very appreciated. Thank you!

    I'll just post my own reply to this.
    What I ended up doing was creating a script that self-creates the database and then populates the tables (as unencrypted... the encryption portion is commented out until store publishing). It's a tremendous amount of code, completely repetitive except for the values I'm entering, but you can't do an insert loop or a multi-line insert statement in AIR's SQLite, so the best move is to create everything line by line.
    This creates the database, and since it's not encrypted, it can be tested using Firefox's SQLite Manager or some other database program. Once you're ready for deployment to the app stores, you simply modify the above setup to use encryption instead of the unencrypted method used for testing.
    So far this has worked best for me. If anyone needs some example code, let me know and I can post it.
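    The same first-run pattern, sketched here in JDBC terms for illustration (AIR's ActionScript SQLite API differs; this assumes the xerial sqlite-jdbc driver, and the table and file names are hypothetical). Wrapping the whole population in one transaction also covers the concern about a user abandoning the app halfway through:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.Statement;

        public class GameDbSetup {
            public static void main(String[] args) throws Exception {
                // Creates game.db on first run; simply opens it afterwards.
                Connection conn = DriverManager.getConnection("jdbc:sqlite:game.db");
                conn.setAutoCommit(false); // all-or-nothing population
                try {
                    Statement st = conn.createStatement();
                    st.executeUpdate("CREATE TABLE IF NOT EXISTS levels ("
                        + "id INTEGER PRIMARY KEY, name TEXT, unlocked INTEGER)");
                    // Repetitive line-by-line inserts, as in the AIR version.
                    st.executeUpdate("INSERT OR IGNORE INTO levels VALUES (1, 'Forest', 1)");
                    st.executeUpdate("INSERT OR IGNORE INTO levels VALUES (2, 'Desert', 0)");
                    conn.commit(); // nothing is visible until the whole setup succeeds
                } catch (Exception e) {
                    conn.rollback(); // abandoned halfway: the database stays clean
                    throw e;
                } finally {
                    conn.close();
                }
            }
        }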

  • Best Practice Table Creation for Multiple Customers, Weekly/Monthly Sales Data in Multiple Fields

    We have a homegrown Access database, originally designed in 2000, that now has a SQL back end. The database has not yet been converted to a higher format such as Access 2007, since at least 2 users are still on Access 2003. It is fine if suggestions only work with Access 2007 or higher.
    I'm trying to determine if our database is the best place to do this or if we should look at another solution. We have thousands of products, each with a single identifier. There are customers who provide us regular sales reporting for what was sold in a given time period - weekly, monthly, quarterly and yearly time periods being most important. This reporting may or may not include all of our product identifiers. The reporting is typically based on calendar-defined timing, although we have some customers who have their own calendars, which may not align to a calendar month or calendar year, so recording the time period can be helpful.
    Each customer's sales report can contain anything from 1,000-20,000 rows of products. Each customer report is different, and they typically have between 4-30 columns of data for each product; headers are consistently named. The product identifiers included may vary by customer and even within each report for a customer; the data in the product identifier row changes each week. Headers include a wide variety of data, such as overall on hand, overall on order, unsellable on hand, returns, on-hand information for each location or customer grouping, sell-through units information for each location or customer grouping for that given time period, sell-through dollars information for each location or customer grouping for that given time period, sell-through units information for each location or customer grouping for a cumulative time period (same thing for dollars), warehouse on hands, warehouse on orders, the customer's unique categorization of our product in their system, the customer's current status code for that product, and so on.
    Currently all of this data is stored in a multitude of Excel spreadsheets (by customer, division and time period). Due to the overall volume of information and the number of Excel sheets, cross-referencing can take considerable time. Is it possible to set up tables for our largest customers so I can create queries and pivot tables to more quickly look at sales-related information by category, by specific product(s), by partner, by specific products or categories across partners, by specific products or categories across specific weeks/months/years, etc.? We do have a separate product table, so only the product identifier or a junction table may be needed to pull in additional information from the product table with queries. We do need to maintain the sales reporting information indefinitely.
    I welcome any suggestions, best practices or resources (books, web, etc.).
    Many thanks!

    Currently all of this data is stored in a multitude of Excel spreadsheets (by customer, division and time period). Due to the overall volume of information and the number of Excel sheets, cross-referencing can take considerable time. Is it possible to set up tables .....
    I assume you want to migrate to SQL Server.
    Your best course of action is to hire a professional database designer for a short period, like a month.
    Once you have the database, you need to hire a professional DBA to move your current data from Access & Excel into the new SQL Server database.
    Finally, you have to hire an SSRS professional to design reports for your company.
    It is also beneficial if the above professionals train your staff while building the new RDBMS.
    Certain senior SQL Server professionals may be able to do all 3 functions in one person: DB design, database administration/ETL & business intelligence development (reports).
    Kalman Toth Database & OLAP Architect
    SELECT Video Tutorials 4 Hours
    New Book / Kindle: Exam 70-461 Bootcamp: Querying Microsoft SQL Server 2012
