Splitting data based on the data available in the columns

I have a table with columns id, amnt1 and amnt2, with id as the primary key:
WITH table_1 AS
(
select '1' id, '200' amnt1,'100' amnt2 from dual union all
select '2' id, '200' amnt1,'' amnt2 from dual union all
select '3' id, '' amnt1,'100' amnt2 from dual union all
select '4' id, '50' amnt1,'' amnt2 from dual union all
select '5' id, '150' amnt1,'270' amnt2 from dual
)
select * from table_1
Depending on the data in amnt1 and amnt2, I need to split the record.
In the first case (id = 1), I need to check whether an amount is present in each column. If both amount columns contain a value, that record must be split into two rows, as below:
WITH table_1 AS
(
select '1' id, '200' amnt1,'' amnt2 from dual union all
select '1' id, '' amnt1, '100' amnt2 from dual
)
select * from table_1
In the second case (id = 2), only one row is needed, because only one amount column contains data:
WITH table_1 AS
(
select '2' id, '200' amnt1,'' amnt2 from dual
)
select * from table_1
In the third case one row, in the fourth case one row, and in the fifth case again two rows.
Basically, if both amnt1 and amnt2 contain data, the record has to be split into two. So the final result will look like this:
WITH table_1 AS
(
select '1' id, '200' amnt1,'' amnt2 from dual union all
select '1' id, '' amnt1,'100' amnt2 from dual union all
select '2' id, '200' amnt1,'' amnt2 from dual union all
select '3' id, '' amnt1,'100' amnt2 from dual union all
select '4' id, '50' amnt1,'' amnt2 from dual union all
select '5' id, '150' amnt1,'' amnt2 from dual union all
select '5' id, '' amnt1,'270' amnt2 from dual
)
select * from table_1
Please help
Here is another sample, where each amount also has a date column:
WITH table_1 AS
(
select '200' amnt1,'2010-02-02' date1,'100' amnt2,'2010-03-08' date2 from dual union all
select '500' amnt1,'2010-02-15' date1,'300' amnt2,'2010-02-08' date2 from dual union all
select '500' amnt1,'2010-02-18' date1,'300' amnt2,'2010-04-09' date2 from dual
)
select * from table_1

user10285699 wrote:
Not correct. This is not what I need. Sorry.
You said the final result should be...
SQL> WITH table_1 AS
  2  (
  3  select '1' id, '200' amnt1,'' amnt2 from dual union all
  4  select '1' id, '' amnt1,'100' amnt2 from dual union all
  5  select '2' id, '200' amnt1,'' amnt2 from dual union all
  6  select '3' id, '' amnt1,'100' amnt2 from dual union all
  7  select '4' id, '50' amnt1,'' amnt2 from dual union all
  8  select '5' id, '150' amnt1,'' amnt2 from dual union all
  9  select '5' id, '' amnt1,'270' amnt2 from dual
10  )
11  select * from table_1
12  /
I AMN AMN
1 200
1     100
2 200
3     100
4 50
5 150
5     270
Saad Nayef's solution gives...
SQL> WITH table_1 AS (SELECT '1' id, '200' amnt1, '100' amnt2 FROM DUAL
  2                   UNION ALL
  3                   SELECT '2' id, '200' amnt1, '' amnt2 FROM DUAL
  4                   UNION ALL
  5                   SELECT '3' id, '' amnt1, '100' amnt2 FROM DUAL
  6                   UNION ALL
  7                   SELECT '4' id, '50' amnt1, '' amnt2 FROM DUAL
  8                   UNION ALL
  9                   SELECT '5' id, '150' amnt1, '270' amnt2 FROM DUAL)
10  SELECT id, amnt1, NULL amnt2 FROM table_1
11  UNION ALL
12  SELECT id, NULL, amnt2 FROM table_1
13  MINUS
14  SELECT id, NULL, NULL FROM table_1
15  /
I AMN AMN
1 200
1     100
2 200
3     100
4 50
5 150
5     270
And if I did it my way I would get...
SQL> ed
Wrote file afiedt.buf
  1  WITH table_1 AS
  2  (
  3  select '1' id, '200' amnt1,'100' amnt2 from dual union all
  4  select '2' id, '200' amnt1,'' amnt2 from dual union all
  5  select '3' id, '' amnt1,'100' amnt2 from dual union all
  6  select '4' id, '50' amnt1,'' amnt2 from dual union all
  7  select '5' id, '150' amnt1,'270' amnt2 from dual
  8  )
  9  select id, amnt1, null amnt2 from table_1 where amnt1 is not null
10  union all
11  select id, null, amnt2 from table_1 where amnt2 is not null
12* order by 1, 2, 3
SQL> /
I AMN AMN
1 200
1     100
2 200
3     100
4 50
5 150
5     270
All of which are identical results.
If there's something wrong in those, you'd better explain what you want, because you've been given a correct answer (hence why I didn't answer this question myself earlier as I could see it was correct).
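The second sample posted in the question (amnt1/date1 and amnt2/date2) has no id column and no expected output, but assuming each date should travel with its amount, a minimal sketch of the same IS NOT NULL / UNION ALL approach applied to it could look like this:
WITH table_1 AS
(
select '200' amnt1,'2010-02-02' date1,'100' amnt2,'2010-03-08' date2 from dual union all
select '500' amnt1,'2010-02-15' date1,'300' amnt2,'2010-02-08' date2 from dual union all
select '500' amnt1,'2010-02-18' date1,'300' amnt2,'2010-04-09' date2 from dual
)
-- one row per non-empty amnt1, keeping its date1
select amnt1, date1, null amnt2, null date2 from table_1 where amnt1 is not null
union all
-- one row per non-empty amnt2, keeping its date2
select null, null, amnt2, date2 from table_1 where amnt2 is not null
order by 1, 2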

Similar Messages

  • No data fields are available in the OLAP cube

    I was able to access data in Excel yesterday, but now all the cubes that I have created cannot be accessed by Excel any more, and after authenticating and selecting the cell for the pivot table I get this error:
    No data fields are available in the OLAP cube
    I just hope this is some process in HANA that needs restarting and not some endemic MDX problem. If so, where do I look?
    Has anyone seen this?
    BTW, shouldn't there be a single log-on for the connection? I have to enter credentials twice: once when creating the connection and once when opening the pivot table.

    Something/someone has locked my user ID while I was recreating the steps in Excel (I don't think I have tried 'wrong' passwords, but I did try a ['wrong' user|http://misiorek.com/h/GMSnap206%202011-11-27.jpg]).
    Here are the steps:
    1. Open [Excel|http://misiorek.com/h/GMSnap202%202011-11-27.jpg].
    2. Enter [connections|http://misiorek.com/h/GMSnap203%202011-11-27.jpg].
    3. Select [HANA cube|http://misiorek.com/h/GMSnap204%202011-11-27.jpg].
    4. Select [Pivot cell|http://misiorek.com/h/GMSnap205%202011-11-27.jpg].
    5. Enter credentials (see above).
    I'm also not sure why I see these security messages:
    [locked|http://misiorek.com/h/GMSnap207%202011-11-27.jpg], Microsoft [warning|http://misiorek.com/h/GMSnap208%202011-11-27.jpg], HANA [warning|http://misiorek.com/h/GMSnap209%202011-11-27.jpg], HANA db [warning|http://misiorek.com/h/GMSnap210%202011-11-27.jpg]

  • Data Manager not available in the BPC 7.0 Action Pane

    Hi,
    I have just installed BPC 7.0 and everything is working fine, except I don't have access to the Data Manager.
    According to the BPC guide, this is how to start Data Manager:
    1. Click the Business Planning and Consolidation icon on your desktop.
    2. From the Business Planning and Consolidation launch page, select Business Planning and Consolidation for Excel.
    3. From the Getting Started - BPC for Excel action pane, select Manage Data.
    I don't have "Manage Data" as an option in the Action Pane. The options I have under Available Task Categories are: "Reporting & Analysis", "Journals" and "Open System Reports".
    I also do not have a menu called eData.
    Any ideas on what I can do?
    Thanks,
    Sam

    Please check your task profile.
    Make sure you have all the task profiles related to Data Manager. Do let us know if you still face this problem.

  • Why is the movie Date Night not available in the US store?

    Date Night seems to be available in the Australia store (http://itunes.apple.com/au/movie/date-night/id370805508), but not in the US store.

    Please note that content purchased from the iTunes Store is country-specific because of Digital Rights Management and Copyright laws that depend on the country you are in and the country the content you want is registered under.
    If you want content from the AU store, you will need to change your account's address and payment information to one that is valid in the AU store. Otherwise, you will need to wait until the studios give iTunes permission to release the movie.

  • Data is not available to the query when uploading an InfoPackage

    Hello People,
    I would like to know what could be the reason why the data is not available when we execute an upload from a flat file to the cube (this one is the last target).
    Consider that there isn't any problem in my target (cube).
    The system shows that the upload was OK, but the status "data available for query" was not activated.
    Thanks,
    Rosana.

    Hi,
    Please go to transaction code SE16 --> table RSDDAGGR.
    For AggrUID (STARTUID), enter "not equal to blank"; you will get a list of all the cubes which have aggregates.
    If your cube is in this list, you need to roll up the data to make it available for reporting. You can roll up the data by going into Manage for the cube.
    -Vikram

  • Which Master data Source is available for the field VBUK-GBSTK (Doc Status)

    Hi
    Which master data source (Attr) is available for the field VBUK-GBSTK (Document Status)?
    Please help me.
    Thanks
    Mannev.

    Hi
    Thanks for your reply.
    That is a transaction data source. I want a master data source.
    Thanks,
    Mannev.

  • bcp doesn't throw an error when the data length exceeds the size of the column

    Hi,
    We are using bcp in SQL Server 2008 R2 to import data from a flat file. When the data length exceeds the size of the column, it doesn't throw any error; instead it ignores the row.
    Please suggest how we can truncate and load the data into the table.
    Thanks,
    Pasha

    Hi Pasha,
    According to your description, you want to import data from a flat file into a SQL Server table, with truncation, in SQL Server 2008 R2. To achieve this, you can use the Import and Export Wizard. For more details, please refer to the following steps:
    Launch SSMS by clicking SQL Server Management Studio from the Microsoft SQL Server program group.
    Right click on the destination database in the Object Explorer, select Tasks, then Import Data from the context menu to launch the Import Wizard.
    Choose Flat File Source as the Data Source, then browse to the flat file.
    Choose SQL Server Native Client 10.0 as the Destination, then select the destination database.
    Click the Edit Mappings button to change column sizes or other properties.
    Finish the wizard.
    For the example about how to use Import and Export wizard, please refer to the blog below:
    http://www.mssqltips.com/sqlservertutorial/203/simple-way-to-import-data-into-sql-server/
    Thanks,
    Katherine Xiong
    TechNet Community Support
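    If truncating the over-length values is acceptable, a minimal T-SQL sketch of another approach, assuming a wide staging table can be used (the table names, column names, sizes and file path below are made up for illustration):
    -- Hypothetical staging table with a column wide enough to hold any input value.
    CREATE TABLE dbo.StagingCustomer (CustomerName varchar(4000));
    -- Hypothetical target table with the real column size.
    CREATE TABLE dbo.Customer (CustomerName varchar(50));
    -- Load the flat file into the staging table (path and terminators are placeholders).
    BULK INSERT dbo.StagingCustomer
    FROM 'C:\data\customers.txt'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
    -- Copy into the target table, explicitly truncating values to the column size.
    INSERT INTO dbo.Customer (CustomerName)
    SELECT LEFT(CustomerName, 50)
    FROM dbo.StagingCustomer;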

  • Assets upload: Deactivation Date is not available in the Standard Program

    Hi,
    I am trying to use the standard program for assets upload, RAALTD01, but the field deactivation date (ANLA-DEAKT) is not available for mapping. This is the only field I am missing out of more than 200 fields involved in the upload. What would be the best approach so that I can use LSMW and at the same time map this deactivation date field, provided we are not using the recording method?
    Looking forward to your replies.
    Regards,
    Sophia Xavier

    You can use an ABAP program with a BAPI:
    call function 'BAPI_FIXEDASSET_OVRTAKE_CREATE'

  • Leaving date is not available in the Query report

    Hi,
    I have a problem: the leaving date is not generated when running a standard SAP report.
    Note:
    I have assigned the leaving action configuration as
    Customer Specific: 3
    Employment: 0
    Specific Payment: 0
    Should we look into the ADMIN LDATE switch?
    Is it anything to do with the feature "LEAVE"?
    Or is it any other issue?
    Thanks for the community.
    Regards
    sekhar

    Hi,
    Run the report RPLMIT00 and, in the selection criteria, select employment status not equal to 3.
    Execute the report.
    You can find the leaving date.
    Warm Regards,
    Kapil Kaushal

  • What type of data must I use in the column (SQL)?

    Hi Guys,
    I have a little problem with this value (2_1232_123). I don't know what type it should be. This value is the result of concatenating three integers with "_".
    The code I have is:
    While readerCodeAccount.Read
        Code_Account = readerCodeAccount.Item(0)
    End While
    Try
        IDTextBox.Text = Code_Company & "-" & Code_Account
    Catch ex As Exception
        ' handle conversion or database errors here
    End Try

    varchar(20)
    But why store it in the database at all if you can generate the concatenation on the fly in the SELECT statement?
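    For instance, a minimal T-SQL sketch of generating the value on the fly (the table and column names here are invented for illustration, since the original post doesn't show the schema):
    -- Hypothetical table holding the three integer parts.
    CREATE TABLE dbo.AccountCodes
    (
        Code_Company int NOT NULL,
        Code_Branch  int NOT NULL,
        Code_Account int NOT NULL
    );
    -- Builds values like '2_1232_123' in the query instead of storing them.
    SELECT CAST(Code_Company AS varchar(10)) + '_' +
           CAST(Code_Branch  AS varchar(10)) + '_' +
           CAST(Code_Account AS varchar(10)) AS Code_Full
    FROM dbo.AccountCodes;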
    Best Regards, Uri Dimant, SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/

  • Date stamp the column

    Hi
    I have a requirement: in a column I need to show the balances for the date 2012/12/31. The column name is Amount Balance, and I need to show that date at the top of the column. Please help me with how to achieve this.
    Thanks
    SR

    I don't believe it - I could have sworn I'd tried that!
    Cheers pal.
    That's fixed it for me.

  • TDMS Data appears in row instead of column

    Hi,
    I am working on a program that reads in temperature data from a TDMS file, shifts the data through a 'normalising' equation and puts it back into the same TDMS file onto a different page.
    The problem I am getting is when the normalised data gets written back into the TDMS file, the data that should appear in the columns now appears in rows. See picture attached that illustrates this.
    Does anyone know how I can write the data to the file so that it appears in the column and not the row? In my VI you will see that I have had to transpose the 2D array, otherwise all the data just appears in one single row.
    Also several cells just containing 0 have appeared in my data set which should not be there.
    I will attach my VI to this. I will also attach one of the TDMS data files.
    Thanks in advance,
    Rhys
    Attachments:
    Row Column Switch.png ‏285 KB
    normalising program.vi ‏36 KB
    TDMS Files.zip ‏18 KB

    Hi Rhys,
    After looking into your normalising program.vi, I would recommend:
    Don't use that Transpose 2D array, it doesn't solve your problem.
    Write the "normalized" data to different channels. You get all the data in one single column because you write all the "normalized" data to one single channel repeatedly, so it all appears in one column (channel).
    The several cells containing zeros come from the float64 y1[30] array in your normalising equation; you need to remove the zero elements from y1[30] before writing to the file.
    I attached the modified normalising program.vi; I hope this helps with your problem.
    The snapshot below shows the data in Excel after "normalising" equation, the channel data appears in columns.
    Attachments:
    normalising program(updated).vi ‏37 KB

  • Filter data on any one of the fields in a hierarchical ALV?

    Hi,
    Can anyone tell me: if I am using a hierarchical ALV for outputting the data and I have to filter the output data on any one of the columns in the hierarchical ALV, is that possible?
    Can we filter the data on the basis of a field which is not a key field in the hierarchical ALV?
    Mrunal

    Try using IT_FILTER...SLIS_T_FILTER_ALV...I guess...
    santhosh

  • Convert data into Date Format imported from MS SQL Server.

    I have imported data from MS SQL Server. The "Date Column" was received in number format, like 41017.6361109954. How can I convert it into a date in Oracle SQL?
    If I import the same data into Excel and change the column type to Date, it converts successfully. But in Oracle I tried the TO_DATE function with different parameters and it didn't work.
    Edited by: XAVER on Apr 22, 2012 2:31 AM

    XAVER wrote:
    The actual date for 41017.6361109954 is 22-Apr-2012 but it's showing 20-APR-2082
    It looks like the offset is January 1, 2000:
    select timestamp '2000-01-01 00:00:00' + numtodsinterval(41017.6361109954,'day') from dual;
    20-APR-12 03.15.59.990002560 PM
    SQL> SY.

  • Missing Data Target in Infopackage for Update ODS Data in Data Target Cube

    Hello & Best Wishes for the New Year to all of you,
    I have 3 ODS (1 on Full Update and 2 with Delta Updates). All these 3 ODS update data to a single CUBE. In my development system this works correctly: data loads from PSA to ODS to Cube.
    Now I have transported this to my QA and Production systems. In QA and Production, I am able to load data up to all 3 ODS and ACTIVATE the data in all these 3 ODS.
    When I try "Update ODS Data in Data Target" to load data from ODS to Cube in QA/PD, the system-created InfoPackage (ODS to CUBE) does not get the Data Target details (Initial Upload / Full Upload). The Data Target tab is blank (the Cube details were expected).
    I have tried to transport again after deleting the update rules.
    Can you suggest what could be the problem ?
    regards - Rajesh Sarin

    Thanks Dinesh,
    I have the ODS to CUBE Update Rules ACTIVE in the QA and PD systems. Still the problem exists only in QA and PD. In DV the ODS to CUBE Data Target is available in the InfoPackage and the data loads correctly to the Cube.
    Listing all the trials I have done:
    1) Originally transported with all the collected objects. In spite of having active Update Rules, the Data Target Cube was missing in QA and PD.
    2) After this problem, I again transported only the Update Rules through the Transport Connection; still the problem did not get solved in QA and PD.
    3) Again, I sent a transport to delete the Update Rules, which deleted the ODS to CUBE update rules in (DV), QA and PD. After that I sent another request to CREATE the ODS to CUBE Update Rules in (DV), QA and PD. Still the Data Target is missing, in spite of having active Update Rules in QA and PD.
    In DV the ODS to CUBE Data Target is available and the data loads correctly to the Cube, even now.
    4) I have also tried "Generate Export Data Source" for the 3 ODS in QA and PD. Still it does not help.
    Can you please suggest ?
    regards - Rajesh Sarin
