Where can I get "first day of the week" data for all locales? Is it present in CLDR spec 24?

I am trying to get "first day of the week" data from CLDR spec 24 but cannot find where to look for it in the spec. I need this data to calculate the numeric value of the LOCAL day of the week.
This data is needed to implement the "c" and "cc" day formats, which equal the numeric local day of the week.
For example, if the "first day of the week" value for a locale is 2 (Monday), the numeric value of the local day of the week will be 1 on a Monday, 2 on a Tuesday, and so on.

Hi,
If you want the week to start on Sunday, use the following formula:
TimestampAdd(SQL_TSI_DAY, 1-DAYOFWEEK(Date'@{var_Date}'), Date'@{var_Date}')
If it is a retail week (starting on Monday), use this instead, which shifts Sunday back to the Monday of its retail week:
TimestampAdd(SQL_TSI_DAY, CASE WHEN DAYOFWEEK(Date'@{var_Date}') = 1 THEN -6 ELSE 2-DAYOFWEEK(Date'@{var_Date}') END, Date'@{var_Date}')
I'm assuming var_Date is the presentation variable from the prompt...

Similar Messages

  • I am having problems with the month of October.  When I have the full month view, the synced items from google calendars won't show.  But they do for all the other months.

    I am having problems with the month of October.  When I have the full month view, the synced items from google calendars won't show.  But they do for all the other months.  There is definitely a glitch somewhere because if I am on day view and try to click on day 15 of October, it will not let me.  It totally is acting weird.

    Free fonts, fonts from a reliable foundry?
    Have you tried removing them temporarily?
    Do you still have problems if you create a new Windows user account and log in with that? How about if you start in Safe Mode?
    I would also run a memory checker, and see Troubleshoot font problems | Windows.

  • I managed to transfer the music from one PC to another PC (to be played by iTunes), but it didn't transfer the metadata (for example the ratings). Does anyone know how to use your old playlists, etc. (= metadata) on the other PC? Please help!

    I managed to transfer the music from one PC to another PC (it plays perfectly on the other PC (a Windows 7 PC) with the newly downloaded and installed iTunes, version 10.6.1.7). But I didn't manage to transfer the metadata (for example the ratings). Does anyone know how to use (or import) your old playlists, etc. (= metadata) on the other PC? In short: how do I migrate the metadata from one PC to another?
    It's a pity that it doesn't seem possible to import e.g. the ratings on the new PC!
    I don't want to start rating my (beautiful classical) music all over again!
    Thanks, from Amsterdam, NL

    Did you do the move via Home Sharing, astro?
    If so, perhaps try the instructions from the following post:
    Re: i transfered itunes to my pc playlists did not go

  • How to generate test data for all the tables in Oracle

    I am planning to use PL/SQL to generate test data for all the tables in a schema. The schema name is given as an input parameter, along with the minimum records in the master tables and the minimum records in the child tables. The data should be consistent in the columns which are used for constraints, i.e. using the same column values.
    I am planning to implement something like
    execute sp_schema_data_gen (schemaname, minrecinmstrtbl, minrecsforchildtable);
    schemaname = owner,
    minrecinmstrtbl = minimum records to insert into each parent table,
    minrecsforchildtable = minimum records to insert into each child table of each master table;
    using all_tables where owner = schemaname,
    all_tab_columns and all_constraints where owner = schemaname,
    and the dbms_random package.
    Does anyone have a better idea of how to do this? Is this functionality already there in the Oracle DB?
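    Just to make the intended shape concrete, here is a rough PL/SQL skeleton of the procedure described above (the procedure and parameter names are the poster's; the body is only a sketch and ignores parent/child ordering, non-trivial data types and error handling):

    CREATE OR REPLACE PROCEDURE sp_schema_data_gen (
      p_schemaname           IN VARCHAR2,
      p_minrecinmstrtbl      IN NUMBER,
      p_minrecsforchildtable IN NUMBER
    ) AS
    BEGIN
      -- Loop over the tables of the given owner; a real implementation would
      -- order parents before children using all_constraints ('R' constraints).
      FOR t IN (SELECT table_name FROM all_tables WHERE owner = p_schemaname) LOOP
        FOR i IN 1 .. p_minrecinmstrtbl LOOP
          NULL; -- build and EXECUTE IMMEDIATE an INSERT here, using
                -- all_tab_columns for the column list and dbms_random
                -- (dbms_random.string, dbms_random.value) for the values.
        END LOOP;
      END LOOP;
    END sp_schema_data_gen;
    /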

    Ah, damorgan, data, test data, metadata and table-driven processes. Love the stuff!
    There are two approaches you can take with this. I'll mention both and then ask which
    one you think you would find most useful for your requirements.
    One approach I would call the generic bottom-up approach which is the one I think you
    are referring to.
    This system is a generic test data generator. It isn't designed to generate data for any
    particular existing table or application but is the general case solution.
    Building on damorgan's advice, define the basic hierarchy: table collection, tables, data; then start at the data level.
    1. Identify/document the data types that you need to support. Start small (NUMBER, VARCHAR2, DATE) and add as you go along
    2. For each data type identify the functionality and attributes that you need. For instance for VARCHAR2
    a. min length - the minimum length to generate
    b. max length - the maximum length
    c. prefix - a prefix for the generated data; e.g. for an address field you might want a 'add1' prefix
    d. suffix - a suffix for the generated data; see prefix
    e. whether to generate NULLs
    3. For NUMBER you will probably want at least precision and scale but might want minimum and maximum values or even min/max precision,
    min/max scale.
    4. Store the attribute combinations in Oracle tables.
    5. Build functionality for each data type that can create the range and type of data that you need. These functions should take parameters that can be used to control the attributes and the amount of data generated (see the sketch after this list).
    6. At the table level you will need business rules that control how the different columns of the table relate to each other. For example, for ADDRESS information your business rule might be that ADDRESS1, CITY, STATE, ZIP are required and ADDRESS2 is optional.
    7. Add table-level processes, driven by the saved metadata, that can generate data at the record level by leveraging the data type functionality you have built previously.
    8. Then add the metadata, business rules and functionality to control the TABLE-TO-TABLE relationships; that is, the data model. You need the same DEPTNO values in the SCOTT.EMP table that exist in the SCOTT.DEPT table.
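    Following up on step 5, here is a minimal VARCHAR2 generator sketch driven by the step-2 attributes (min/max length, prefix, suffix, NULL generation); the function name and defaults are invented for illustration:

    CREATE OR REPLACE FUNCTION gen_varchar2 (
      p_min_len  IN PLS_INTEGER,
      p_max_len  IN PLS_INTEGER,
      p_prefix   IN VARCHAR2 DEFAULT NULL,
      p_suffix   IN VARCHAR2 DEFAULT NULL,
      p_null_pct IN NUMBER   DEFAULT 0   -- percentage of NULLs to generate
    ) RETURN VARCHAR2 AS
      l_len PLS_INTEGER;
    BEGIN
      -- Sometimes return NULL, according to the requested percentage.
      IF dbms_random.value(0, 100) < p_null_pct THEN
        RETURN NULL;
      END IF;
      -- Pick a random length between the min and max attributes.
      l_len := TRUNC(dbms_random.value(p_min_len, p_max_len + 1));
      -- Random uppercase string, wrapped in the optional prefix/suffix.
      RETURN p_prefix || dbms_random.string('U', l_len) || p_suffix;
    END gen_varchar2;
    /
    -- Example: SELECT gen_varchar2(5, 15, 'add1') FROM dual;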
    The second approach I have used more often. I would call it the top-down approach, and I use
    it when test data is needed for an existing system. The main use case here is to avoid
    having to copy production data to QA, TEST or DEV environments.
    QA people want to test with data that they are familiar with: names, companies, code values.
    I've found they aren't often fond of random character strings for names of things.
    I use this second approach for mature systems where there is already plenty of data to choose from.
    It involves selecting subsets of data from each of the existing tables and saving that data in a
    set of test tables. This data can then be used for regression testing and for automated unit testing of
    existing functionality and functionality that is being developed.
    QA can use data they are already familiar with and can test the application (GUI?) interface on that
    data to see if they get the expected changes.
    For each table to be tested (e.g. DEPT) I create two test system tables. A BEFORE table and an EXPECTED table.
    1. DEPT_TEST_BEFORE
         This table has all DEPT table columns and a TEST_CASE column.
         It holds DEPT-image rows for each test case that show the row as it should look BEFORE the
         test for that test case is performed.
         CREATE TABLE DEPT_TEST_BEFORE (
           TESTCASE NUMBER,
           DEPTNO NUMBER(2),
           DNAME VARCHAR2(14 BYTE),
           LOC VARCHAR2(13 BYTE)
         );
    2. DEPT_TEST_EXPECTED
         This table also has all DEPT table columns and a TEST_CASE column.
         It holds DEPT-image rows for each test case that show the row as it should look AFTER the
         test for that test case is performed.
    Each of these tables is a mirror image of the actual application table, with one new column
    added that contains the test case number (TESTCASE).
    To create test case #3 identify or create the DEPT records you want to use for test case #3.
    Insert these records into DEPT_TEST_BEFORE:
         INSERT INTO DEPT_TEST_BEFORE
         SELECT 3, D.* FROM DEPT D where DEPTNO = 20;
    Insert records for test case #3 into DEPT_TEST_EXPECTED that show the rows as they should
    look after test #3 is run. For example, if test #3 creates one new record, include all the
    records from the BEFORE data set and add a new row for the new record.
    When you want to run a test case (test case #3, say) the process is basically (ignore for this illustration that
    there is a foreign key between DEPT and EMP):
    1. delete the records from SCOTT.DEPT that correspond to test case #3 DEPT records.
              DELETE FROM DEPT
              WHERE DEPTNO IN (SELECT DEPTNO FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3);
    2. insert the test data set records for SCOTT.DEPT for test case #3.
              INSERT INTO DEPT
              SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3;
    3. Perform the test.
    4. Compare the actual results with the expected results.
         This is done by a function that compares the records in DEPT with the records
         in DEPT_TEST_EXPECTED for test #3 (a comparison sketch follows this list).
         I usually store these results in yet another table or just report them out.
    5. Report out the differences.
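    For step 4, one simple way to implement the comparison is a symmetric difference between the live table and the expected image for that test case; a minimal sketch, using the DEPT example and column names from above:

    -- Rows the test produced that were not expected:
    SELECT DEPTNO, DNAME, LOC FROM DEPT
    MINUS
    SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_EXPECTED WHERE TESTCASE = 3;

    -- Rows that were expected but are missing or different after the test:
    SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_EXPECTED WHERE TESTCASE = 3
    MINUS
    SELECT DEPTNO, DNAME, LOC FROM DEPT;

    -- Both queries returning no rows means test case #3 passed.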
    This second approach uses data the users (QA) are already familiar with, is scalable, and
    makes it easy to add new data that meets business requirements.
    It is also easy to automatically generate the necessary tables and test setup/breakdown
    using a table-driven metadata approach. Adding a new test table is as easy as calling
    a stored procedure; the procedure can generate the DDL or create the actual tables needed
    for the BEFORE and AFTER snapshots.
    The main disadvantage is that existing data will almost never cover the corner cases.
    But you can add data for these. By corner cases I mean data that defines the limits
    for a data type: a VARCHAR2(30) name field should have at least one test record that
    has a name that is 30 characters long.
    Which of these approaches makes the most sense for you?

  • How to update the condition price in the sales order for all the items

    Hi,
    How can I update the condition price for all the items in the sales order so that it automatically carries the new price, through a standalone program, for all the orders in the billing due list table?
    Thanks,
    Balaram

    Hi,
    There is a change in the requirement.
    Scenario:
    I have created a sales order with some 4 condition types; 2 of the condition types are of class A & B and the other two are of class C. Here I need to update the condition price of the class A & B types only, and the remaining condition types should not get updated even though an updated price is available.
    For the above scenario, I need to write a standalone program. Do we have any function modules to update the price of a single condition in the sales order? Please tell me how we can update the sales order at the item condition level.
    Thanks.
    Balaram

  • Change the Page Title for all the pages when viewing the Reports in Bi Publisher

    Hello ,
    I have a requirement to change the page title of all the report pages displayed in the browser window.

    As far as I know, the answer is NO.
    That is a very simple answer, Manikandan-S-Oracle.
    For 1210326's requirements there are several places where the "title" property can be changed, and on some pages the "title" can be different.
    A single place or option for changing it does not exist, IMHO (I may be wrong),
    so you can change the individual places for your needs, though I'm not sure whether that is adequate.
    For example:
    ...wls\user_projects\applications\bipdomain\xmlpserver\catalog\navigator.jsp  -->  find <title>
    This changes the title at http://somehost:7001/xmlpserver/servlet/catalog
    ...wls\user_projects\applications\bipdomain\xmlpserver\home\home.jsp  -->  find <title>
    This changes the title at http://somehost:7001/xmlpserver/servlet/home

  • Org Management - How can I change the start date of all the org units

    How can I change the start date of all the org unit objects and their relationships in one shot? I have created some 100 org units with the wrong start date. Please suggest.
    Thanks in advance..
    Cheers
    Vijay

    Hello Archana,
    Thanks a lot... This has solved my issue...
    Cheers
    Vijay

  • Why is iOS 6 compatible with all the iPhones, but not all the phones got all the upgrades, for example the panorama camera?

    Why is iOS 6 compatible with all the iPhones, but not all the phones got all the upgrades, for example the panorama camera? This is another reason I am starting to dislike Apple. An iPod is able to use an upgrade but not all the phones are? Makes sense.

    "why is the ios6 compatible for all the iphones"
    Ios 6 is not available for all iphones. Original Iphone and iphone 3 are not supported.
    "but not all the phones got the all upgrades, "
    This is common for most any device/computer.  Not all iphone/computer/device models can handle all of the new features.  Like any device, hardware is limited and cannot always support all new features and software.

  • Can I adjust all the master gains for all the keyframes at once?

    I edited a whole scene and keyframed some corrections, but I wanted to change a few settings for the keyframes, such as master gain. Is there any way to do this so I can adjust all the master gains for all the keyframes at once? Thanks!

    If it's the vignette geometry that is keyframed, and not the corrections as written (there is a distinction), then the color values are presumably NOT keyframed, as the keyframe only applies to that room/component of the grade, and not the whole grade. Your Primary In correction, in this case, is NOT keyframed, and all the normal rules for revising corrections apply.
    Apple COLOR has a systemic definition: a change in values for a ROOM is a "correction", and the accumulation of corrections for a clip is a "grade".
    Keyframes only govern the behaviour of the room that they are created in, and don't apply across the board to the whole grade. The advantage is that the corrections can act independently; the disadvantage is that the grade itself cannot be copied and pasted into another keyframed clip without some preparation.
    There are ways of doing that that are not in the Manual.
    I concur with your view of tracking, and you do have to do a wetware estimate to decide whether the investment is worth it... maybe a start and end keyframe is all you need, and adjust the trajectory as required.
    Wetware= that 'goo' between most people's ears, that starts right behind their eyes.
    jPo

  • How to get first day of calendar week ?

    Hi,
    I'm storing some calendar weeks in my database, for example 30/2013.
    How can I get the first day of this week (Monday)? In this case it is 22.07.2013.
    Thanks,
    Anja

    As far as I know we don't have a function that converts a week of the year into a date, so you can try something like this:
    with t as (
      select '30/2013' week_year from dual
    )
    select min(dt) week_start_day
      from (
            select (level - 1) + to_date('01-01-' || substr(week_year, -4), 'dd-mm-yyyy') dt
                 , week_year
              from t
           connect by level <= to_date('31-12-' || substr(week_year, -4), 'dd-mm-yyyy') -
                               to_date('01-01-' || substr(week_year, -4), 'dd-mm-yyyy') + 1
           )
     where to_number(to_char(dt, 'iw')) = to_number(substr(week_year, 1, instr(week_year, '/') - 1));
    WEEK_START_DAY
    22-JUL-13
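    If these are ISO week numbers (which the to_char(dt, 'iw') above implies), another option is to use the fact that January 4th always falls in ISO week 1: truncate that date to the Monday of its ISO week and add (week - 1) * 7 days. A minimal sketch, not a tested production solution:

    with t as (
      select '30/2013' week_year from dual
    )
    select trunc(to_date('04-01-' || substr(week_year, -4), 'dd-mm-yyyy'), 'IW')
           + (to_number(substr(week_year, 1, instr(week_year, '/') - 1)) - 1) * 7
             as week_start_day
      from t;
    -- For '30/2013' this also returns 22-JUL-13.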

  • DCIteratorBinding setting the same value for all the rows.

    Hi all,
    I have a table with one of the columns as id. I have made the id column a hyperlink that takes me to the next page. I am trying to pass the id to the next page. I am using a managed bean for this. The code below is used to create the link on the column. It sets the action to the function in the bean that sets the id of the current row.
    <tr:column sortProperty="id" sortable="false"
    headerText="#{bindings.notification.hints.id.label}"
    id="c4">
    <tr:commandLink action="choice" text="#{row.bindings.id.inputValue}"
    id="cl1" actionListener="#{IdBean.extractID}">
    </tr:commandLink>
    </tr:column>
    Below is IdBean.extractID():
    public void extractID(ActionEvent actionEvent) {
        BindingContext bindingContext = BindingContext.getCurrent();
        BindingContainer bindings = bindingContext.getCurrentBindingsEntry();
        DCIteratorBinding iter = (DCIteratorBinding) bindings.get("notificationIterator");
        Row rw = iter.getCurrentRow();
        String id = (String) rw.getAttribute("id");
        this.setId2(id);
    }
    I am printing the id value on the next page, but it just returns the value of the id of the first row for all the rows of the table.
    Any inputs? Do I need to refresh the iterator or something like that?
    Regards
    Sishant

    Hi,
    Following is the code I have added in my bean:
    ValueExpression expression = getFacesContext().getApplication().getExpressionFactory().createValueExpression(getFacesContext().getELContext(), "#{pageFlowScope.emp1}", Object.class);
    Object id = expression.getValue(getFacesContext().getELContext());
    public FacesContext getFacesContext() {
        return FacesContext.getCurrentInstance();
    }
    JSPX Code -
    <af:commandImageLink id="DuncanAngove"
                         icon="/john.gif" partialSubmit="true"
                         actionListener="#{Tweets.setEmployeeId}">
      <af:setActionListener from="Duncan Angove" to="#{pageFlowScope.emp1}"/>
    </af:commandImageLink>
    But I am getting a NullPointerException. I have tried it with application and request scope as well.
    javax.servlet.ServletException
         at javax.faces.webapp.FacesServlet.service(FacesServlet.java:277)
         at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
         at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
         at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.adf.model.servlet.ADFBindingFilter.doFilter(ADFBindingFilter.java:191)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter.doFilter(RegistrationFilter.java:97)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:420)
         at oracle.adfinternal.view.faces.activedata.AdsFilter.doFilter(AdsFilter.java:60)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:420)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl._doFilterImpl(TrinidadFilterImpl.java:247)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl.doFilter(TrinidadFilterImpl.java:157)
         at org.apache.myfaces.trinidad.webapp.TrinidadFilter.doFilter(TrinidadFilter.java:92)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:94)
         at java.security.AccessController.doPrivileged(Native Method)
         at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:313)
         at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:414)
         at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:138)
         at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.dms.wls.DMSServletFilter.doFilter(DMSServletFilter.java:330)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.doIt(WebAppServletContext.java:3684)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3650)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
         at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2268)
         at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2174)
         at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1446)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    Caused by: java.lang.NullPointerException
         at org.apache.myfaces.trinidad.component.UIXComponentBase.getValueExpression(UIXComponentBase.java:231)
         at project1.Tweets.setEmployeeId(Tweets.java:40)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         at com.sun.el.parser.AstValue.invoke(AstValue.java:157)
         at com.sun.el.MethodExpressionImpl.invoke(MethodExpressionImpl.java:283)
         at org.apache.myfaces.trinidadinternal.taglib.util.MethodExpressionMethodBinding.invoke(MethodExpressionMethodBinding.java:53)
         at org.apache.myfaces.trinidad.component.UIXComponentBase.broadcastToMethodBinding(UIXComponentBase.java:1259)
         at org.apache.myfaces.trinidad.component.UIXCommand.broadcast(UIXCommand.java:183)
         at javax.faces.component.UIViewRoot.broadcastEvents(UIViewRoot.java:475)
         at javax.faces.component.UIViewRoot.processApplication(UIViewRoot.java:756)
         at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._invokeApplication(LifecycleImpl.java:698)
         at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._executePhase(LifecycleImpl.java:285)
         at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:177)
         at javax.faces.webapp.FacesServlet.service(FacesServlet.java:265)
         ... 35 more
    Thanks
    Sishant

  • Pulling the header text for all the purchase orders

    Hello,
    Please help in resolving the below:
    Is there any code in SAP to pull the header text for all the purchase orders in the database?
    Is there any tcode to automatically pull the specific text from the requisition (plant-wise in sourcing)?
    Thanks

    STXH - text header
    STXL - text detail
    Check the message thread below:
    Table to read PO Header text and item text
    BR,
    Krishna

  • Defaulting the storage location for all the line items in my sales order

    Hi,
    I have a scenario to default the storage location automatically in my sales order for all the line items. It is not going to change for a particular sales org / distribution channel / division combination. Can you please suggest a solution for this?
    Thanks
    Ghanesh.

    Hi,
       You can default the storage location by using the user exit below.
    Storage location
    Auto determination of storage location as 'XXXX' for sales orders
    Include - MV45AFZB
    Form - USEREXIT_SOURCE_DETERMINATION
    IF VBAK-AUART = 'XXXX' AND VBAK-VKORG = 'XXXX' AND VBAK-VTWEG = 'XX'.
      VBAP-LGORT = 'ABCD'.
    ENDIF.
    Regards,
    Gopal.

  • List the storage location for all the plants

    Hello
    How can I list all the storage locations of all the plants in a single view?
    I have tried V_T001L, but it goes plant by plant.
    rgds
    Nile.Y

    Hello,
    The following link may help you:
    List of Storage Locations by Plant
    BR,
    Tushar
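    If you just need a raw list, the storage locations of all plants are stored in table T001L, so a quick extract via SE16 (or any SQL tool) works; a minimal sketch, assuming the standard fields WERKS (plant), LGORT (storage location) and LGOBE (description):

    -- List every storage location of every plant in one result set.
    SELECT werks, lgort, lgobe
      FROM t001l
     ORDER BY werks, lgort;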

  • GL Account Master Data for all the inventory Accounts

    Hi,
    What is the common and unique feature in the GL master data of all the inventory-related accounts? Is it "POST AUTOMATICALLY" or something else?
    Thanks,
    Lavanya

    In addition to the 'post automatically' option, please also ensure that the field status group selected is the one relating to 'material accounts'.
