Records matching a data model (mask)

Hi,
I would like to write a query that returns records where a field matches a data model (or mask).
The data must be 4 letters followed by 4 digits (for example, "OPTI1457").
Do you know how I can do that?
Thank you for your help.
Patrick

How about this then?
SELECT t.*
  FROM (SELECT 'OPTI1457' col1
          FROM dual
         UNION
        SELECT 'OPT123'
          FROM dual
         UNION
        SELECT 'OPTIM12345'
          FROM dual
        ) t
WHERE TRIM(TRANSLATE(SUBSTR(t.col1, 1, 4), 'ABCDEFGHIJKLMNOPQRSTUVWXYZ', ' ')) IS NULL
   AND TRIM(TRANSLATE(SUBSTR(t.col1, 5, 4), '0123456789', ' ')) IS NULL
   AND LENGTH(t.col1) = 8 -- without a length check, longer values such as 'ABCD1234X' would slip through
;
C.
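The TRIM/TRANSLATE approach above predates regular expression support; on Oracle 10g and later the same mask can be written directly as REGEXP_LIKE(col1, '^[A-Za-z]{4}[0-9]{4}$'). As a quick way to sanity-check the pattern outside the database, here is a minimal Python sketch using the sample values from the query above:

```python
import re

# Mask: exactly 4 letters followed by exactly 4 digits, nothing else.
MASK = re.compile(r"^[A-Za-z]{4}[0-9]{4}$")

def matches_mask(value: str) -> bool:
    """Return True if value is 4 letters followed by 4 digits."""
    return MASK.fullmatch(value) is not None

samples = ["OPTI1457", "OPT123", "OPTIM12345"]
print([s for s in samples if matches_mask(s)])  # ['OPTI1457']
```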

Similar Messages

  • Display records according to month

    hi all.
    i want to display records according to month.
    i created a view, something like this:
    create or replace view month_timer (item1, amount,item_date) as
    select item1,sum(amount),item_date
    from table
    where trunc(item_date) between to_date('01-jan-2010') and trunc(sysdate)
    AND item_TYPE!=4
    group by item1,item_date
    ORDER BY item_date
    any suggestions?
    sarah

    create or replace view month_timer (item1, amount, item_date) as -- here you declare only three columns for the view.
    select pcd_plt_id, pcd_cbr_id, sum(amount), item_date -- and here the select list has four columns. So which column is for item1?
    group by item1, item_date -- in GROUP BY you can only use the selected columns, but you are using item1, which is the view column name from above.
    Also, the select has three columns without an aggregate function, yet you are grouping by only two of them.
    If you could post the table data and desired output, it will be easier to find a better way.
    -Ammad
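Sarah's underlying task, totals per month rather than per day, comes down to grouping on a month key (in Oracle that would be TRUNC(item_date, 'MM') in both the SELECT and the GROUP BY). A hedged Python sketch of the same idea, with illustrative field names and sample values:

```python
from collections import defaultdict
from datetime import date

# Hypothetical rows: (item1, amount, item_date)
rows = [
    ("A", 10.0, date(2010, 1, 5)),
    ("A", 15.0, date(2010, 1, 20)),
    ("A", 7.0,  date(2010, 2, 3)),
    ("B", 2.0,  date(2010, 2, 14)),
]

# Group on (item, year, month) -- the equivalent of GROUP BY item1, TRUNC(item_date, 'MM')
totals = defaultdict(float)
for item, amount, d in rows:
    totals[(item, d.year, d.month)] += amount

for key in sorted(totals):
    print(key, totals[key])
```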

  • Modeling transaction data

    Hello all,
    I have 2 questions that I was hoping to get an answer to:
    Question 1:
    What is the normal way of modeling transaction data with a changing status in a BW system? Are there any links/threads to read?
    I thought that transaction data would go into the DSO, so that any changes to transaction data would be recorded there (very granular), while the aggregation of data would be placed in the cube.
    Question 2:
    For what reason would someone place a navigation attribute in the dimension of a cube?
    TIA
    PS - this is for BI 7.0
    Edited by: Casey Harris on Feb 4, 2008 10:15 PM

    Casey,
    A couple of quick answers that aren't links:
    1)  Ideally BW 7.0 allows you to create an Enterprise Data Warehouse (EDW), where you have granular data loaded to DSO's that then aggregate the data into cubes.    That is what we strive for.  In practice it doesn't always work out that way.  Do some searches on EDW and you should find some info.
    2)  Navigation attributes are essentially links to master data attributes.  By not putting them directly in a cube, you save a little bit of space in the cube.  The most common use of them that we have is when users tell us they want to filter on a field that is not directly in a cube, but is in the master data attributes.  We can then easily make that field a navigation attribute.  Otherwise if you wanted to add the field to the cube, you'd have to reload all the data, which can be quite painful.
    Michael

  • Differences between operational systems data modeling and data warehouse data modeling

    Hello Everyone,
    Can anybody help me understand the differences between operational systems data modeling and data warehouse data modeling?
    Thanks

    Hello A S!
    You mean the difference between modelling in normal forms, e.g. 3NF as in operational systems (OLTP), and modelling an InfoCube in a data warehouse (OLAP)?
    While in an OLTP system you want data tables free of redundancy and ready for transactions, meaning writing and reading few records often, in an OLAP system you need to read a lot of data for every query you run against the database, and you often aggregate these amounts of data.
    Therefore a different principle is used for the database schema, called the star schema. It means that you have one central table (called the fact table) which holds the key figures and has keys to other tables with characteristics. These other tables are called dimension tables; they hold combinations of the characteristics. Normally you design your dimensions to be small, so access to the data is more efficient.
    The star schema in SAP BI is a little more complex than explained here, but it follows the same concept.
    Best regards,
    Peter
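Peter's star schema can be made concrete with a toy example: one fact table holding a key figure, one dimension table holding characteristics, queried with a join plus an aggregate. A minimal sketch using SQLite, with invented table and column names:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Dimension table: descriptive characteristics
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT)")
# Fact table: key figures plus foreign keys into the dimensions
cur.execute("CREATE TABLE fact_sales (product_id INTEGER, revenue REAL)")

cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Laptop"), (2, "Phone")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 900.0), (1, 1100.0), (2, 500.0)])

# Typical OLAP-style query: aggregate the facts, sliced by a dimension attribute
cur.execute("""
    SELECT d.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product d ON d.product_id = f.product_id
    GROUP BY d.category
    ORDER BY d.category
""")
result = cur.fetchall()
print(result)  # [('Laptop', 2000.0), ('Phone', 500.0)]
```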

  • Power view couldn't load the model or data source because the data source type is not supported

    Hi,
    I have SQL 2012 standard edition in my local. I have developed SSAS & deployed in local. I have been asked to develop power view report in excel 2013 using this SSAS. But when I tried to do in Excel 2013 professional Plus, I am getting below error:
    Power view couldn't load the model or data source because the data source type is not supported.
    Is Power View supported in the Standard edition of SQL Server, or does it require the Business Intelligence/Enterprise edition?
    Thanks in advance

    What type of SSAS install are you using?
    PowerView in Excel 2013 currently only supports Tabular data sources.
    Only PowerView in Sharepoint 2013 supports both Tabular and Multi-Dim data sources. (provided you have the required Sharepoint and SQL updates installed)
    http://darren.gosbell.com - please mark correct answers

  • Loading ODS - Data record exists in duplicate within loaded data

    BI Experts,
    I am attempting to load an ODS with the Unique Data Records flag turned ON.  The flat file I am loading is a crosswalk with four fields; the first 3 fields are being used as Key Fields in order to make the records unique.  I have had this issue before, but gave up in frustration and added an ascending number count field to simply create a unique key.  This time I would like to solve the problem if possible.
    The errors come back referring to two data rows that are duplicate:
    Data record 1 - Request / Data package / Data record: REQU_4CNUD93Q3RCC80XFBG2CZXJ0T/000003/ 339
    Data record 2 - Request / data package / data record: REQU_4CNUD93Q3RCC80XFBG2CZXJ0T/000003/ 338
    And below here are the two records that the error message refers to:
    3     338     3902301480     19C*     *     J1JD     
    3     339     3902301510     19C*     *     J1Q5     
    As you can see, the combination of my three Key Fields should not be creating a duplicate. (3902301480, 19C(asterisk) , (asterisk))   and (3902301510, 19C(asterisk) , (asterisk))  I replaced the *'s because they turn bold!
    Is there something off with the numbering of the data records?  Am I looking in the wrong place?  I have examined my flat file and cannot find duplicates, and the records that BW says are duplicates are not. I am really having a hard time with this - any and all help greatly appreciated!!!

    Thank you for the response Sabuj....
    I was about to answer your questions but I wanted to try one more thing, and it actually worked.  I simply moved the MOST unique Key Field to the TOP of my Key Field list. It was at the bottom before.
    FYI for other people with this issue -
    Apparently the ORDER of your Key Fields is important when trying to avoid creating duplicate records.
    I am using four data fields, and was using three of them as the Key Fields.  No combination of all three would have a duplicate; however, when BW finds that the first two key fields match, sometimes it apparently doesn't care about the third one, which would make the row unique.  By simply changing the order of my Key Fields I was able to stop getting the duplicate row errors...
    Lesson - If you KNOW that your records are unique, and you are STILL getting errors for duplicates, try changing the ORDER of your key fields.
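Before blaming the loader, it is also worth verifying outside BW that the flat file really is unique on the chosen key fields. A small hedged Python sketch (the field positions and sample rows are illustrative, taken from the two records quoted above):

```python
from collections import Counter

# Hypothetical rows from a 4-field crosswalk file; the first 3 fields form the key.
rows = [
    ("3902301480", "19C*", "*", "J1JD"),
    ("3902301510", "19C*", "*", "J1Q5"),
]

# Count each composite key; anything counted more than once is a true duplicate.
key_counts = Counter(row[:3] for row in rows)
duplicates = {key: n for key, n in key_counts.items() if n > 1}
print(duplicates)  # empty dict -> the composite keys are unique
```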

  • How to get master data records that do not have transaction data in a query

    Hi,
    How can we get master data records that do not have transaction data in the query output? Can we create a query, or is there any other way to get them?

    Hi,
    Create a MultiProvider which includes the transactional data target and the master data InfoObject. Make sure that identification for this master data InfoObject is ticked on both providers.
    Create a report on this MultiProvider and keep the master data InfoObject in rows; now you should be able to see all the values which are there in the master data InfoObject, irrespective of whether a transaction happened or not.
    Next you may create a condition showing only zero key figure values, i.e. master data without any transaction.
    Hope that helps.
    Regards
    Mr Kapadia
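Outside BW, the same requirement, master records with no matching transactions, is the classic anti-join pattern: a LEFT JOIN followed by a test for NULL on the joined side. A hedged SQLite sketch with invented table names:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE master (customer_id INTEGER PRIMARY KEY)")
cur.execute("CREATE TABLE transactions (customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO master VALUES (?)", [(1,), (2,), (3,)])
cur.executemany("INSERT INTO transactions VALUES (?, ?)", [(1, 10.0), (3, 5.0)])

# LEFT JOIN keeps every master record; the NULL test keeps only the unmatched ones
cur.execute("""
    SELECT m.customer_id
    FROM master m
    LEFT JOIN transactions t ON t.customer_id = m.customer_id
    WHERE t.customer_id IS NULL
""")
without_transactions = [row[0] for row in cur.fetchall()]
print(without_transactions)  # [2]
```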

  • Data Models and Data Flow diagrams.

    Hi  Gurus,
        Can anybody brief me the concept of Data Models and Data Flow Diagrams and their development with illustrations. And is it a responsibility of a Technical or a Functional consultant..i.e to translate Business requirements and functional specifications into technical specifications, data flow diagrams and data models.
    Your valuable answers will be rewarded.
    Thanks in advance.

    Hi,
    Concept of Data Models
    A data model, or data modelling, is basically how you define or design your BW architecture based on business requirements. It deals with designing and creating an efficient BW architecture while sticking to standard practices.
    Multi-Dimensional Modeling with SAP NetWeaver BI
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/6ce7b0a4-0b01-0010-52ac-a6e813c35a84
    /people/githen.ronney3/blog/2008/02/13/modeling-strategies
    Modeling the Data Warehouse Layer with BI
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3668618d-0c01-0010-1ab5-aa75c3a4dfc2
    /people/gilad.weinbach2/blog/2007/02/23/a-beginners-guide-to-your-first-bi-model-in-nw2004s
    Data Flow Diagrams
    This shows the path of data flow for each individual object in BW: how data gets loaded into that object, how it goes out of the object, etc.
    Right click on the data target > show data flow.
    It shows all the intermediate layers through which data comes into that particular object.
    Responsibility of a Technical or a Functional consultant
    This is generally done in the designing phase itself, by a Senior Technical Consultant with the help of a Functional consultant, or by a Techno-Functional consultant interacting with the business.
    Hope this helps.
    Thanks,
    JituK

  • To search the records according to certain criteria

    Hello,
    I am a user of jdeveloper using swing/jclient for bc4j.
    How do I search the records according to certain specified criteria, i.e. if the user enters 'c' in the textbox then all the names starting with the letter 'c' match the criteria?
    It is much the same as (LIKE 'c%') in Oracle.
    if there is anyone who can help me then please do send reply back to me.
    thank you

    You need to use the setWhereClause method on the BC4J view object to set it to your search criteria.
    Do a search for this method on OTN or Google and you'll find several samples.
    Like this one:
    http://otn.oracle.com/products/jdev/htdocs/handson/WebServices/HOS903WS.html
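Whatever the framework, a prefix search boils down to a LIKE with a bound parameter; the WHERE fragment below is the kind of clause you would hand to setWhereClause. A hedged SQLite sketch of the underlying query (table and column names invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE emp (name TEXT)")
cur.executemany("INSERT INTO emp VALUES (?)", [("carol",), ("carl",), ("dave",)])

def names_starting_with(prefix: str) -> list:
    """Equivalent of WHERE name LIKE 'c%', with the prefix bound as a parameter."""
    cur.execute("SELECT name FROM emp WHERE name LIKE ? || '%' ORDER BY name",
                (prefix,))
    return [row[0] for row in cur.fetchall()]

print(names_starting_with("c"))  # ['carl', 'carol']
```

Binding the prefix as a parameter (rather than concatenating it into the SQL string) also avoids SQL injection, which matters just as much when building a BC4J WHERE clause from user input.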

  • Get only one record for an id for a date if multiple record exists

    Hi,
    I need help with below mentioned scenario.
    DB Version: 11.2.0.3.0.
    Requirement:
    Fetch account records that were created after last run of program
    Get latest record for an account on a given date if there are multiple records for same account.
    If there is a gap of more than 1 day from last run of program, then get latest record for an account for each date if there are multiple records for same account.
    create table t_test (
      Id           number not null,
      cust_id      number not null,
      cust_Acct_id number not null,
      ins_date     date   not null
    );
    insert into t_test values
    (1, 12345, 678905, to_date('05/31/2012 12:05:10 PM','MM/DD/YYYY HH:MI:SS PM'));
    insert into t_test values
    (2, 12345, 678905, to_date('05/31/2012 05:25:46 PM','MM/DD/YYYY HH:MI:SS PM'));
    insert into t_test values
    (3, 12345, 678905, to_date('05/31/2012 11:48:00 PM','MM/DD/YYYY HH:MI:SS PM'));
    insert into t_test values
    (4, 12345, 678905, to_date('06/01/2012 12:05:10 PM','MM/DD/YYYY HH:MI:SS PM'));
    insert into t_test values
    (5, 12345, 678905, to_date('06/01/2012 05:25:46 PM','MM/DD/YYYY HH:MI:SS PM'));
    insert into t_test values
    (6, 12345, 678905, to_date('06/01/2012 11:48:00 PM','MM/DD/YYYY HH:MI:SS PM'));
    insert into t_test values
    (7, 12345, 678905, to_date('06/02/2012 12:05:10 PM','MM/DD/YYYY HH:MI:SS PM'));
    insert into t_test values
    (8, 12345, 678905, to_date('06/02/2012 05:25:46 PM','MM/DD/YYYY HH:MI:SS PM'));
    insert into t_test values
    (9, 12345, 678905, to_date('06/02/2012 11:48:00 PM','MM/DD/YYYY HH:MI:SS PM'));
    create table t_log (
      id            number not null,
      prgrm_id      number not null,
      last_run_date date   not null
    );
    insert into t_log values
    (1,1009,to_date('5/30/2012 07:05:12 AM','MM/DD/YYYY HH:MI:SS PM'));
    Result required:
    id cust_id cust_acct_id ins_date
    3 12345 678905 '05/31/2012 11:48:00 PM'
    6 12345 678905 '06/01/2012 11:48:00 PM'
    9 12345 678905 '06/02/2012 11:48:00 PM'
    I tried the SQL below, but it returns only the id 9 record.
    select
        id,
        cust_id,
        cust_acct_id,
        ins_date
    from
        (
        select
            id,
            cust_id,
            cust_acct_id,
            ins_date,
            row_number() over (partition by cust_acct_id order by ins_date desc) rn
        from
            t_test t
        where
            t.ins_date > (
                          select
                              last_run_date
                          from
                              t_log l
                          where
                              l.prgrm_id = 1009
                         )
        )
    where rn = 1;
    Thanks in advance.

    Try:
    Try:
    select
        id,
        cust_id,
        cust_acct_id,
        ins_date
    from
        (
        select
            t.id,
            t.cust_id,
            t.cust_acct_id,
            t.ins_date,
            row_number() over (partition by cust_acct_id, trunc(ins_date) order by ins_date desc) rn
        from
            t_test t
        ,   t_log l
        where
            t.ins_date >= l.last_run_date
        and l.prgrm_id = 1009
        )
    where rn = 1;
            ID    CUST_ID CUST_ACCT_ID INS_DATE
             3      12345       678905 31-05-2012 23:48:00
             6      12345       678905 01-06-2012 23:48:00
             9      12345       678905 02-06-2012 23:48:00
    But I now see that Bob already nailed it, while I was testing it ;)
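The logic of that query, newest row per account per calendar day among rows after the last run, is easy to restate outside SQL as well. A hedged Python sketch using the thread's sample data (the PM times converted to 24-hour form):

```python
from datetime import datetime

# (id, cust_acct_id, ins_date) rows from the thread's sample data
rows = [
    (1, 678905, datetime(2012, 5, 31, 12, 5, 10)),
    (2, 678905, datetime(2012, 5, 31, 17, 25, 46)),
    (3, 678905, datetime(2012, 5, 31, 23, 48, 0)),
    (4, 678905, datetime(2012, 6, 1, 12, 5, 10)),
    (5, 678905, datetime(2012, 6, 1, 17, 25, 46)),
    (6, 678905, datetime(2012, 6, 1, 23, 48, 0)),
    (7, 678905, datetime(2012, 6, 2, 12, 5, 10)),
    (8, 678905, datetime(2012, 6, 2, 17, 25, 46)),
    (9, 678905, datetime(2012, 6, 2, 23, 48, 0)),
]
last_run = datetime(2012, 5, 30, 7, 5, 12)

# Keep, per (account, day), the row with the greatest timestamp -- the equivalent
# of ROW_NUMBER() OVER (PARTITION BY cust_acct_id, TRUNC(ins_date) ORDER BY ins_date DESC)
latest = {}
for rec_id, acct, ts in rows:
    if ts < last_run:
        continue  # only rows created after the last program run
    key = (acct, ts.date())
    if key not in latest or ts > latest[key][2]:
        latest[key] = (rec_id, acct, ts)

result_ids = sorted(r[0] for r in latest.values())
print(result_ids)  # [3, 6, 9]
```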

  • How to model hierarchical data?

    I need a way to model hierarchical data. I have tried using an object so far, and it hasn't worked. Here is the code for the class I made: http://home.iprimus.com.au/deeps/StatsGroupClass.java. As you can see, there are 4 fields: 1 to store the name of the "group", 2 integer data fields, and 1 Vector field to store all descendants. Unfortunately, this does not seem to be working, as the Vector get(int index) method returns an Object. This is the error I get:
    Test.java:23: cannot resolve symbol
    symbol  : method getGroupName  ()
    location: class java.lang.Object
          echo("Primary Structure with index 0: " + data.get(0).getGroupName());
                                                            ^
    1 error
    I figure I can't use the approach I have been using because of this.
    Can anyone help me out?

    You need to cast the return value from get(0):
    ((YourFunkyClass)data.get(0)).getGroupName();
    Be aware that you're opening yourself up to the possibility of a runtime ClassCastException. You could consider using generics if you can guarantee that the data Vector will contain only instances of YourFunkyClass.
    Hope this helps

  • Random Records are being skipped while uploading data in PSA from CSV File

    Hi Experts,
    I am facing issue in data uploading in PSA through CSV file, Random Records are being skipped while uploading data in PSA.
    First Load
    We have flat file (.txt in CSV format), which contains 380240 Records.
    We are uploading the flat file data into PSA from Application Server. But it uploads Only 380235 records, 5 Records are being skipped.
    Second Load
    We have re-generated same file next day, which contains same No of Records (380240), but this time it uploads Only 380233 records, 7 Records are being skipped.
    We found 1 skipped record from the first load (based on the key column combination, by cross-verifying the source against the PSA table). But the same record (combination of key columns) is available in the second load. It means the same records are not being skipped every time.
    Earlier (5 months ago) , We have loaded 641190 Records from flat file in same PSA and all records (641190) were uploaded successfully.
    There is no change in the Source, PSA, or flat file structure.
    Thanks & Regards
    Bijendra

    Hi Bijendra,
    Please check the file: if a record has an escape sign at the beginning, that record may be skipped, so records may be missing.
    Please check for escape signs such as ';' - if one is present at the beginning, that record will be skipped entirely.
    Regards
    vamsi

  • How to model this data

    In the sample data below, there are rows that contain header names, followed by a row with the data.
    The problem is that some of the "header" column values change.  They represent box "sizes".
    Name | BoxType | Color | Qty | 45 | 11 | 13.5
    Compx | F | Red | 32 | 1 | 0 | 34
    Name | BoxType | Color | Qty | 75 | 11 | 12.5
    QuickMartZ | G | Blue | 68 | 13 | 7 | 77
    Name | BoxType | Color | Qty | 75 | 11 | 45
    QuickMartZ | F | Blue | 22 | 17 | 72 | 12
    How could I model this data or re-shape it into a schema such as
    Table
    =========
    AccountName
    BoxType
    Color
    Qty
    Size
    Ultimately I need to be able to extract a "rolled up" count of the boxes by size and their quantity
    Something like this
    AccountName
    BoxType
    Color
    Qty_Size1
    Qty_Size2
    Qty_Size3
    Qty_Size4
    Qty_Size5
    Qty_SizeN...

    Without some value which links the two rows together (other than the order of rows, how do we know the box in the line above Compx belongs to it?), I don't think this is going to be possible as a set-based solution.
    You could use a cursor to move through the rows RBAR (row by agonizing row):
    DECLARE @table TABLE (name VARCHAR(20), boxType VARCHAR(20), color VARCHAR(20), qty VARCHAR(4), col1 INT, col2 INT, col3 FLOAT)

    INSERT INTO @table (name, boxType, color, qty, col1, col2, col3)
    VALUES
    ('Name', 'BoxType', 'Color', 'Qty', 45, 11, 13.5),
    ('Compx', 'F', 'Red', '32', 1, 0 , 34),
    ('Name', 'BoxType', 'Color', 'Qty', 75, 11, 12.5),
    ('QuickMartZ', 'G', 'Blue', '68', 13, 7 , 77),
    ('Name', 'BoxType', 'Color', 'Qty', 75, 11, 45),
    ('QuickMartZ', 'F', 'Blue', '22', 17, 72, 12)

    DECLARE @name VARCHAR(20), @boxType VARCHAR(20), @color VARCHAR(20), @qty VARCHAR(4), @col1 INT, @col2 INT, @col3 FLOAT,
            @pname VARCHAR(20), @pboxType VARCHAR(20), @pcolor VARCHAR(20), @pqty VARCHAR(4), @pcol1 INT, @pcol2 INT, @pcol3 FLOAT

    DECLARE @products TABLE (name VARCHAR(20), boxType VARCHAR(20), color VARCHAR(20), qty VARCHAR(4), size1 FLOAT, size2 FLOAT, size3 FLOAT)
    DECLARE @boxes TABLE (name VARCHAR(20), boxType VARCHAR(20), size1 FLOAT, size2 FLOAT, size3 FLOAT)

    DECLARE c1 CURSOR
    FOR SELECT *
          FROM @table

    OPEN c1
    FETCH c1 INTO @name, @boxType, @color, @qty, @col1, @col2, @col3
    WHILE @@FETCH_STATUS <> -1
    BEGIN
        IF @name = 'Name'
        BEGIN
            -- header row: remember its values (the last 3 columns are the sizes)
            SET @pname = @name
            SET @pboxType = @boxType
            SET @pcolor = @color
            SET @pqty = @qty
            SET @pcol1 = @col1
            SET @pcol2 = @col2
            SET @pcol3 = @col3
        END
        ELSE
        BEGIN
            -- data row: emit it together with the sizes remembered from the preceding header row
            INSERT INTO @products (name, boxType, color, qty, size1, size2, size3) VALUES (@name, @boxType, @color, @qty, @col1, @col2, @col3)
            INSERT INTO @boxes (name, boxType, size1, size2, size3) VALUES (@name, @boxType, @pcol1, @pcol2, @pcol3)
        END
        FETCH c1 INTO @name, @boxType, @color, @qty, @col1, @col2, @col3
    END
    CLOSE c1
    DEALLOCATE c1

    SELECT * FROM @products
    SELECT * FROM @boxes
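If the file can be brought into an application layer first, the pairing of header and data rows can also be done there, producing the long shape (one row per size) directly. A hedged Python sketch using the sample rows from the thread:

```python
rows = [
    ("Name", "BoxType", "Color", "Qty", 45, 11, 13.5),
    ("Compx", "F", "Red", "32", 1, 0, 34),
    ("Name", "BoxType", "Color", "Qty", 75, 11, 12.5),
    ("QuickMartZ", "G", "Blue", "68", 13, 7, 77),
    ("Name", "BoxType", "Color", "Qty", 75, 11, 45),
    ("QuickMartZ", "F", "Blue", "22", 17, 72, 12),
]

# Walk the rows in pairs: a header row carries the sizes, the next row the quantities.
long_rows = []   # (AccountName, BoxType, Color, size, qty)
sizes = None
for row in rows:
    if row[0] == "Name":
        # header row: remember the size columns (positions 4..6)
        sizes = row[4:]
    else:
        # data row: emit one output row per size
        name, box_type, color = row[0], row[1], row[2]
        for size, qty in zip(sizes, row[4:]):
            long_rows.append((name, box_type, color, size, qty))

for r in long_rows:
    print(r)
```

As in the cursor solution, this still relies entirely on row order, which is why a real linking key between the two physical rows would be the safer design.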

  • Please prefix 'mp4:' to the stream name to record H264/AAC/HE-AAC encoded data at FMS using DVR...

    I was able to modify the main.asc file in the application/livepkgr directory to include:
    * DVRSetStreamInfo :
    * This prototype was created to allow DVR recording functionality from FMLE to FMS
    Client.prototype.DVRSetStreamInfo = function (info)
    {
        var s = Stream.get( "mp4:" + info.streamname + ".f4v" ) ;
        if (s)
        {
            if (info.append)
                s.record ("append") ;
            else
                s.record () ;
            s.play (info.streamname) ;
        }
    }
    I get three status messages in the FMLE encoding log:
    Requested DVR command has been successfully issued to Primary FMS server for stream livestream1
    Requested DVR command has been successfully issued to Primary FMS server for stream livestream2
    Please prefix 'mp4:' to the stream name to record H264/AAC/HE-AAC encoded data at FMS using DVR feature
    Now I have a few issues:
    1. How do I fix the issue with the 3rd status message "Please prefix 'mp4:' to the stream name to record H264/AAC/HE-AAC encoded data at FMS using DVR feature", since the code above is already prefixing "mp4:" to the stream name?
    2. In the applications/livepkgr/streams directory there is a file called "undefined.f4v" which is telling me the code above isn't passing the info.streamname variable.
    3. Also, what do I need to do to playback this .f4v file. I've tried opening it with Adobe Media Player, but it doesn't recognize it.
    I'm obviously using multi-bitrate streaming and that is working flawlessly.  My goal is to record this livestream to later playback as an mp4 file.
    Any ideas?
    UPDATE:
    I know on page 15 of the FMS 4.5.1 developer's guide, "Configure DVR (HDS)" under "Publish a DVR stream", it states "To publish a DVR stream from FMLE, DO NOT check Record OR check DVR Auto Record. Publish the stream just as you publish any live stream." In the next section, "Play DVR streams", it states that I can use SMP (Strobe Media Playback), which I am.  So that brings up two more questions:
    1. Why is my SMP player not displaying the DVR functionality (see image):
    In my SMP configuration, my streamType is "LiveOrRecorded" - the default. There is a streamType = "dvr", but will I lose live functionality?
    2. If I want to later on, package the stream into an mp4 file and play it back later for those who missed the live stream, what is the best approach for that?
    Message was edited by: giostefani

    Two things: you need to have the "DVRCast" application on the server side, i.e. "dvrcast_origin", for DVR recording to work; and secondly, for mp4 recording your stream name should be "mp4:<streamname>.mp4" while publishing.

  • Select record according to latest date

    I have two fields in my DB table: one is region and the other is date.
    There are many records for a region (R1, let's say).
    My requirement is that I want the record (only one record) with the latest date for this region.

    hi,
    try like this.
    SELECT - aggregate
    Syntax
    ... { MAX( [DISTINCT] col )
        | MIN( [DISTINCT] col )
        | AVG( [DISTINCT] col )
        | SUM( [DISTINCT] col )
        | COUNT( DISTINCT col )
        | COUNT( * )
        | count(*) } ... .
    Effect
    As many of the specified column labels as you like can be listed in the SELECT command as arguments of the above aggregate expression. In aggregate expressions, a single value is calculated from the values of multiple rows in a column as follows (note that the addition DISTINCT excludes double values from the calculation):
    MAX( [DISTINCT] col ) Determines the maximum value of the value in the column col in the resulting set or in the current group.
    MIN( [DISTINCT] col ) Determines the minimum value of the content of the column col in the resulting set or in the current group.
    AVG( [DISTINCT] col ) Determines the average value of the content of the column col in the resulting set or in the current group. The data type of the column has to be numerical.
    SUM( [DISTINCT] col ) Determines the sum of the content of the column col in the resulting set or in the current group. The data type of the column has to be numerical.
    COUNT( DISTINCT col ) Determines the number of different values in the column col in the resulting set or in the current group.
    COUNT( * ) (or count(*)) Determines the number of rows in the resulting set or in the current group. No column label is specified in this case.
    reward if useful.
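For the original question, an aggregate alone is not quite enough, because the whole record is wanted, not just the MAX of the date; in ABAP Open SQL the usual pattern is a SELECT with ORDER BY date DESCENDING and UP TO 1 ROWS, or a MAX( date ) lookup followed by a read. The idea in a hedged Python sketch, with invented field names:

```python
from datetime import date

# Hypothetical rows: (region, record_date, payload)
rows = [
    ("R1", date(2024, 1, 5), "a"),
    ("R1", date(2024, 3, 9), "b"),
    ("R1", date(2024, 2, 1), "c"),
    ("R2", date(2024, 4, 2), "d"),
]

def latest_for_region(region):
    """Return the single record with the greatest date for the given region."""
    candidates = [r for r in rows if r[0] == region]
    return max(candidates, key=lambda r: r[1]) if candidates else None

print(latest_for_region("R1"))  # the ("R1", 2024-03-09, "b") record
```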
