Data Element Length - Restriction / Best Practice

Many databases on the market currently allow table/column names up to 128 characters long. Our developers are suggesting that 30-35 characters is the limit one should strive for.
One thought is that Business Objects cannot handle queries bigger than 64K – is this true? If a query has several filters or result fields that are very long, will that be a problem?
Also, is there some potential usage issue / best practice by which using a shorter name would be advantageous in Business Objects?
Our tool stack is SQL Server 2005 / Informatica 8.1.1 / Business Objects XI R2

Hi Jennifer,
I am not sure if the 64K limit is true. However, given that this is the query string that BO passes to your SQL Server, any such limit would apply to the generated SQL as a whole.
The following query is about 2K, so you can imagine that a 64K one would be difficult to read:
SELECT
     A_Long_Table_Name.TestField1,
     A_Long_Table_Name.TestField12,
     A_Long_Table_Name.TestField22,
     A_Long_Table_Name.TestField32,
     A_Long_Table_Name.TestField42,
     A_Long_Table_Name.TestField52,
     A_Long_Table_Name.TestField62,
     A_Long_Table_Name.TestField72,
     Another_Long_Table_Name.TestField1,
     Another_Long_Table_Name.TestField12,
     Another_Long_Table_Name.TestField13,
     Another_Long_Table_Name.TestField14,
     Another_Long_Table_Name.TestField15,
     Another_Long_Table_Name.TestField16,
     Another_Long_Table_Name.TestField17
FROM
     A_Long_Table_Name,
     Another_Long_Table_Name
WHERE
     A_Long_Table_Name.TestField1 = Another_Long_Table_Name.TestField1
UNION ALL
SELECT
     A_Long_Table_Name2.TestField1,
     A_Long_Table_Name2.TestField12,
     A_Long_Table_Name2.TestField22,
     A_Long_Table_Name2.TestField32,
     A_Long_Table_Name2.TestField42,
     A_Long_Table_Name2.TestField52,
     A_Long_Table_Name2.TestField62,
     A_Long_Table_Name2.TestField72,
     Another_Long_Table_Name.TestField1,
     Another_Long_Table_Name.TestField12,
     Another_Long_Table_Name.TestField13,
     Another_Long_Table_Name.TestField14,
     Another_Long_Table_Name.TestField15,
     Another_Long_Table_Name.TestField16,
     Another_Long_Table_Name.TestField17
FROM
     A_Long_Table_Name2,
     Another_Long_Table_Name
WHERE
     A_Long_Table_Name2.TestField1 = Another_Long_Table_Name.TestField1
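Legibility also improves with explicit join syntax and short table aliases. As a sketch (the aliases are my own choice, not part of the original schema), the first branch of the query above could be written as:

```sql
-- First SELECT branch rewritten with ANSI join syntax and short aliases.
-- Same tables and columns as the example above; only the layout changes.
SELECT
    a.TestField1, a.TestField12, a.TestField22, a.TestField32,
    a.TestField42, a.TestField52, a.TestField62, a.TestField72,
    b.TestField1, b.TestField12, b.TestField13, b.TestField14,
    b.TestField15, b.TestField16, b.TestField17
FROM A_Long_Table_Name AS a
JOIN Another_Long_Table_Name AS b
  ON a.TestField1 = b.TestField1;
```

This keeps the join condition next to the tables it relates, which matters more as the statement approaches any size limit.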
On the object names, I do not believe you will see any noticeable difference in performance. It is better to keep the names of columns and tables legible, and your DBA's suggestion of 30-35 is about right.
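If you want to audit how close your identifiers already are to that guideline, a quick sketch against the SQL Server 2005 catalog views (the 30-character threshold here is just the suggested limit, not a hard rule):

```sql
-- List column names longer than 30 characters, longest first.
-- Uses the standard sys.tables / sys.columns catalog views.
SELECT t.name          AS table_name,
       c.name          AS column_name,
       LEN(c.name)     AS name_length
FROM sys.columns AS c
JOIN sys.tables  AS t ON t.object_id = c.object_id
WHERE LEN(c.name) > 30
ORDER BY name_length DESC;
```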
Regards
Alan

Similar Messages

  • Import data from excel file - best practice in the CQ?

    Hi,
I have a question related to importing data from an Excel file and creating a table from that data on a CQ page. Is there an OOTB component inside CQ which provides this kind of functionality? Has somebody implemented something like this, or is there a best practice for it?
    Thanks in advance for any answer,
    Regards
    kasq

You can check a working example package [1] (use your Adobe ID to log in).
After installing it, go to [2] for an immediate example.
Unfortunately, it only supports the old OLE-2 Excel format (.xls, not .xlsx).
    [1] - http://dev.day.com/content/packageshare/packages/public/day/cq540/demo/xlstable.html
[2] - http://localhost:4502/cf#/content/geometrixx/en/company/news/pressreleases/my_personal_bests.html

  • Data Migration and Consolidation Best Practices?

    Hi guys
Do you know what the best practice for data migration to FCSvr is? We’re trying to consolidate all media on various FireWire/internal drives to a centralised RAID directly attached to a dedicated server. We’ve found that dragging and dropping an FCP project file uploads its associated media. The problem is that if there are several versions or separate projects linking to the same media, the associated media is re-uploaded every time! This results in that media being duplicated several times. It appears that the issue is due to FCSvr creating a subfolder for every project file being uploaded, which contains all the project’s media.
    This behaviour is not consistent when caching assets, checking out a project file, making changes and checking it back in. FCSvr is quite happy for a project file to link to media existing at the root level of the media device.
    We are of course running the latest version of everything. Hope you can help as we’re pulling our hair out here!
    Regards
    Gavin

    Hi,
Do you really need an ETL tool for these loading processes? Have you considered doing it in SQL/PLSQL? If the performance of the application is one of the main priorities, I would definitely consider doing it in SQL/PLSQL.
Because of the huge amount of data, and because your source and target systems are Oracle DBs, I wouldn't recommend using Informatica.
Also, because source and target are Oracle DBs and it should be near real time, you should have a look at Oracle Streams.
    Regards
    Maurice

  • Data element length modification

    Hi friends,
    We are working on a requirement that requires a change to an existing data element/field.
This field has been used in various tables/structures/programs and FMs. Essentially, a change to this field requires us to modify a lot of programs.
I am just wondering if there are any alternative approaches. I need your help to evaluate:
1. The change required is to increase the length of the field to accommodate higher numbers.
2. Instead of modifying all the programs, are there any alternative approaches?
    Thank you
    Kris

Make sure to take the following precautions before doing any change:
1. Run the where-used list for that table/field and check whether it has been used in any programs or FMs. If yes, also check the TYPE of the other variables into which the system moves the data (MOVE, WRITE or FM parameters); if you do not consider this, you can land in big trouble (conversion dump).
2. Ask Basis to take a backup of the production, quality and development systems, to be on the safe side if something does not go right.
Now you can make the changes in your development system, then adjust the database and see the impact.
Hopefully you will not come across any difficult situation with these changes.

  • HR data transfer toolkit_ SAP Best practice

    Hi All,
I got to know about the new toolkit for HR data transfer, in which SAP has provided built-in Excel sheets for certain infotypes. I got the document online, but I am not able to use the transactions. Can anyone help me out: from where do I download the HR data transfer toolkit, and are there any patches I have to install in the system?
Can you please help me with the links or steps for downloading the toolkit.
    Advance thanks for the help.
    Regards
    Rajeshwar

    Rajesh,
You have to request Basis to apply the SAR file that is specified in the Best Practices documentation.
Also note that the data transfer tool for ECC 6.0 is different from ECC 5.0 or R/3.
The data transfer tool has now (in ECC 6) shifted to LSMW.
    regards
    Sridhar

  • Data files available in Best Practice document for BPC

Any idea how to retrieve the data files available under the Miscellaneous folder on the Best Practice CD? These files are not available online and I don't know how to retrieve them. Any help kindly appreciated.
    Sachin

    Download available through swdc.

  • Storing data in a session : best practice

    Hi,
We are designing a Servlet/JSP based application that has a web tier separate from the middle tier.
One of our apps has a lot of user input, on average 500K and up to 2MB of data in the request.
We do not have a way of breaking this application up (i.e. the whole 2MB of form data must be posted at one time).
We have 2 solutions and want to know which is the better one and why...
1. Use a session and store all the information in the session.
2. Use JavaScript to assemble all the data and submit it at one time.
I prefer #2 because I don't want to use sessions and also because I don't want to use a database on the web tier...
Please help me explain to my colleagues who are convinced that we have to use sessions to store this data.
-JJ

    Hello,
When you say you want to load data to another cube, that means one cube holds data for 2 years and the other for 2 years... so they tend to occupy the same table space.
When you say summarized loading, what exactly do you mean by that?
The data is summarized in the cube as per the characteristics present in it... so if you reduce the number of characteristics in the second cube, the data will get aggregated at that level, giving fewer records and occupying less table space.
Also, you can reduce the table space by just compressing the requests in the cube.
    Regards,
    Shashank

  • Increase maktx data element Length in makt table

    Dear Expert,
    Please guide me or provide me a solution.....
There is a requirement in my company from an MM consultant: he wants me to increase the length of the MAKTX field from 40 to 100.
Please tell me, is it possible? If possible, what is the solution and what is the impact?
Please, I am waiting for a reply...
    Regards
    Shelly Malik

    Hi
    @Matt
You can see a similar type of requirement posted today: Change The length of field MAKTX.
    Regards
    Vinod
    Edited by: Vinod Kumar on Jun 25, 2010 11:21 AM

  • Upcoming SAP Best Practices Data Migration Training - Chicago

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    SAP America, Downers Grove in Chicago, IL:
November 3 – 5, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Services to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
2.     Data Services fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
3.     Installation and configuration of the SBOP Data Services – Installation and deployment of the Data Services and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
5.     Overview of Data Quality within the Data Migration process – A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license.
    Logistics & How to Register
Nov. 3 – 5: SAP America, Downers Grove, IL
                 Wednesday 10AM – 5PM
                 Thursday 9AM – 5PM
                 Friday 8AM – 3PM
                 Address:
                 SAP America – Buckingham Room
                 3010 Highland Parkway
                 Downers Grove, IL USA 60515
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Services on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
•         Data Migration consultants and IDoc experts involved in data migration and integration projects
•         Functional experts that perform mapping activities for data migration
•         ABAP developers who write load programs for data migration
    Trainers
Oren Shatil – SAP Business All-in-One Development
Frank Densborn – SAP Business All-in-One Development
    To register please use the hyperlink below.
    http://service.sap.com/~sapidb/011000358700000917382010E

    Hello,
The link does not work. Is this training still available?
    Regards,
    Romuald

  • Upcoming SAP Best Practices Data Migration Training - Berlin

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
Berlin, Germany: October 06 – 08, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Integrator to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
2.     Data Integrator fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
3.     Installation and configuration of the SBOP Data Integrator – Installation and deployment of the Data Integrator and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    Logistics & How to Register
October 06 – 08: Berlin, Germany
                 Wednesday 10AM – 5PM
                 Thursday 9AM – 5PM
                 Friday 9AM – 4PM
                     SAP Deutschland AG & Co. KG
                     Rosenthaler Strasse 30
                     D-10178 Berlin, Germany
                     Training room S5 (1st floor)
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Integrator on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
•         Data Migration consultants and IDoc experts involved in data migration and integration projects
•         Functional experts that perform mapping activities for data migration
•         ABAP developers who write load programs for data migration
    Trainers
Oren Shatil – SAP Business All-in-One Development
Frank Densborn – SAP Business All-in-One Development
    To register please follow the hyperlink below
    http://intranet.sap.com/~sapidb/011000358700000940832010E

    Hello,
The link does not work. Is this training still available?
    Regards,
    Romuald

  • Item Master Data Best Practice

    hello all
We have been using SBO for more than a year now, and yet we still constantly add new items to our item master data. What is the best practice for maintaining the item master data? To help you understand, this is the scenario: since there are a lot of spare parts and pieces of equipment in the factory/mill, if some equipment is damaged we have to buy a new one. Here the problem occurs: if the replacement only differs in part number, we use another item code for it. With this practice, we later found that we have more than one item code for a single item because of the naming convention, so we have to put the old item code on hold and use the new one, because we can't delete it anymore. Sometimes an item code occurs only once in the item history.
Please suggest the best practice on this matter:
    1. Item Grouping
    2. Naming Convention
    etc..
    NOTE:
Our goal is to minimize adding items to the item master data.
    FIDEL

    FIDEL,
From what I understand, you have to replace broken/damaged components of items like bulldozers, payloaders and mill turbines. This is the reason why you defined the parts as new items.
From your item code examples, I am not clear why you have 2 different names for the same item, and also what you mean by "these two item codes are actually the same".
If you are just buying parts to replace components, and if you do not need to track them, then I would suggest you create generic item codes in the item master and simply change the description when you buy/sell them.
    Example:  Same Item different description.
    REPL101  OIL FILTER
    REPL101  FUEL FILTER
    REPL101  xxxxx
This way you will not keep creating items in the database, and you can also see the description and know what it was.
Simply change the ItemName in the marketing document and, instead of pressing Tab to move to the next column, press CTRL+Tab so that SAP does not auto-check the newly typed name against the item master.
Let me know if your scenario is otherwise.
    Suda

  • Why CHAR DATA ELMNT LENGTH allowed is different in SAP and BI?

    Hi All
Why is the character data element length of text different in SAP R/3 and BI, though both products belong to SAP?
Any ideas to solve this issue to get long text, or any workaround in the query?
How do I find the table to fetch from using FM READ_TEXT?

    Hi,
Workaround for more than 60 characters in length:
Check this blog:
/people/marc.bernard/blog/2009/08/07/the-60-character-restriction-for-bw-data-models
Reason for the 60-character restriction – the exact reason is not known:
- Performance is probably an issue.
- You don't really need longer InfoObjects; they just thought it was enough.
- Longer texts would have been fine, but still not necessary.
    Hope it helps.
    Thanks and Regards,
    MuraliManohar.

  • Data element status showing Modified / Active

    Hi Gurus,
I created one custom field using EEW in the Web UI; that field's length and type is NUMC,2. My requirement is to change the length from 2 to 3. Since I am not able to change the length from the Web UI, I am making the change manually in SE11. I have changed the particular data element length to 3 and activated the data element, but its status is showing Modified/Active instead of Active.
I have checked SE09; it is showing inactive objects. Please help me with how to get out of this problem and how to activate the data element.
    Thanks & Regards,
    Sunil B

    Hi Sanly yan,
I have done the table adjustment in SE14.
    Thanks & Regards,
    Sunil B

  • Change VTTK-TEXT1 Data Element Display text

    Hi All,
Can anyone please tell me if there is any way I can change the display text of the data element VTTK_TEXT1 without getting an access key from SAP?
    Thanks ,
    Dilum

    Hi Dilum,
You can do this using text exits:
TCODE: CMOD
Goto -> Text Enhancement -> Keyword -> Change (give the data element name), then Save.
    Best Regards,
    Pravin

  • Unicode Migration using National Characterset data types - Best Practice ?

I know that Oracle discourages the use of the national character set and national character set data types (NCHAR, NVARCHAR), but that is the route my company has decided to take, and I would like to know the best practice regarding this, specifically in relation to stored procedures.
    The database schema is being converted by changing all CHAR, VARCHAR and CLOB data types to NCHAR, NVARCHAR and NCLOB data types respectively and I would appreciate any suggestions regarding the changes that need to be made to stored procedures and if there are any hard and fast rules that need to be followed.
    Specific questions that I have are :
    1. Do CHAR and VARCHAR parameters need to be changed to NCHAR and NVARCHAR types ?
    2. Do CHAR and VARCHAR variables need to be changed to NCHAR and NVARCHAR types ?
    3. Do string literals need to be prefixed with 'N' in all cases ? e.g.
    in variable assignments - v_module_name := N'ABCD'
    in variable comparisons - IF v_sp_access_mode = N'DL'
    in calls to other procedures passing string parameters - proc_xyz(v_module_name, N'String Parameter')
    in database column comparisons - WHERE COLUMN_XYZ = N'ABCD'
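To make the pattern in the examples above concrete, here is a minimal Oracle sketch; the table and column names are hypothetical, invented for illustration only. The N prefix marks a literal as national character set data, so it is not first interpreted in the database character set:

```sql
-- Hypothetical table for illustration: NVARCHAR2 columns hold national
-- character set data (AL16UTF16 in the setup described above).
CREATE TABLE module_registry (
    module_name  NVARCHAR2(30),
    access_mode  NVARCHAR2(2)
);

-- N'...' creates national character set literals, so the values are not
-- squeezed through the WE8MSWIN1252 database character set first.
INSERT INTO module_registry (module_name, access_mode)
VALUES (N'ABCD', N'DL');

SELECT module_name
FROM   module_registry
WHERE  access_mode = N'DL';
```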
    If anybody has been through a similar exercise, please share your experience and point out any additional changes that may be required in other areas.
    Database details are as follows and the application is written in COBOL and this is also being changed to be Unicode compliant:
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    NLS_CHARACTERSET = WE8MSWIN1252
    NLS_NCHAR_CHARACTERSET = AL16UTF16

##1. While doing a test conversion I discovered that VARCHAR parameters need to be changed to NVARCHAR2 and not VARCHAR2; same for VARCHAR variables.
VARCHAR columns/parameters/variables should not be used, as Oracle reserves the right to change their semantics in the future. You should use VARCHAR2/NVARCHAR2.
##3. Not sure I understand; are you saying that Unicode columns (NVARCHAR2, NCHAR) in the database will only be able to store character strings made up from WE8MSWIN1252 characters?
No, I meant literals. You cannot include non-WE8MSWIN1252 characters in a literal. Actually, you can include them under certain conditions, but they will be transformed to an escaped form. See also the UNISTR function.
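For characters outside WE8MSWIN1252, the UNISTR function mentioned above builds a national character set string from escaped Unicode code points; a small sketch (the sample string is just an illustration):

```sql
-- UNISTR takes backslash-escaped UTF-16 code units;
-- \20AC is the code point for the euro sign.
SELECT UNISTR('Price: \20AC 100') AS label
FROM   dual;
```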
    ## Reason given for going down this route is that our application works with SQL Server and Oracle and this was the best option
    ## to keep the code/schemas consistent between the two databases
    First, you have to keep two sets of scripts anyway because syntax of DDL is different between SQL Server and Oracle. There is therefore little benefit of just keeping the data type names the same while so many things need to be different. If I designed your system, I would use a DB-agnostic object repository and a script generator to produce either SQL Server or Oracle scripts with the appropriate data types or at least I would use some placeholder syntax to replace placeholders with appropriate data types per target system in the application installer.
    ## I don't know if it is possible to create a database in SQL Server with a Unicode characterset/collation like you can in Oracle, that would have been the better option.
    I am not an SQL Server expert but I think VARCHAR data types are restricted to Windows ANSI code pages and those do not include Unicode.
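For comparison, a minimal SQL Server sketch (again with a hypothetical table): Unicode data goes into NVARCHAR columns, while VARCHAR is bound to the collation's code page, which is why there is no direct analogue of choosing a Unicode database character set:

```sql
-- SQL Server side: NVARCHAR stores Unicode regardless of the database
-- collation's code page; VARCHAR is limited to that code page.
CREATE TABLE module_registry (
    module_name NVARCHAR(30),
    notes       VARCHAR(100)
);

INSERT INTO module_registry (module_name, notes)
VALUES (N'ABCD', 'code-page text only here');
```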
    -- Sergiusz
