V$SESSION source table

Hi All,
What is the base table for V$SESSION?
And in which schema can I find it?
Actually, I want to look up a particular user's login time and his PC information from one month ago.
Regards

"v$ tables are stored in control files."
No, they are not. They are in memory ONLY.
"after successful startup mount command, you can access to v$ tables & views."
That part is mostly true. (Except that there are no 'V$' tables. The V$ views are actually public synonyms for V_$ views owned by SYS, which are built on X$ tables.)
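
If you want to see this for yourself, the data dictionary spells it out (run as a DBA); V$FIXED_VIEW_DEFINITION shows the X$-based query behind each fixed view:

-- Confirm that V$SESSION is a public synonym for SYS.V_$SESSION:
SELECT owner, table_owner, table_name
FROM   dba_synonyms
WHERE  synonym_name = 'V$SESSION';

-- Show the X$-based definition of the underlying fixed view:
SELECT view_definition
FROM   v$fixed_view_definition
WHERE  view_name = 'V$SESSION';

And because these views live in memory only, V$SESSION cannot answer the original question about a login from a month ago; that history would have to come from something like database auditing, if it was enabled at the time.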

Similar Messages

  • How to access (DML) not a source table?

    Hello,
    I created a form (using the wizard) based on a table.
    I want to insert into another table when a button is pressed. I'm playing around with the "Automatic Row Processing (DML)" process and its "source table".
    And this is what i'm getting:
    0.03: Processing point: ON_SUBMIT_BEFORE_COMPUTATION
    0.03: Branch point: BEFORE_COMPUTATION
    0.03: Computation point: AFTER_SUBMIT
    0.03: Tabs: Perform Branching for Tab Requests
    0.03: Branch point: BEFORE_VALIDATION
    0.03: Perform validations:
    0.03: ...Item Not Null Validation: P19_APPL_BUSINESS_AREA
    0.03: ...Item Not Null Validation: P19_APPL_SERVER
    0.03: ...Item Not Null Validation: P19_NSM_SUPPORT_LEVEL
    0.03: ...Item Not Null Validation: P19_ENV_TYPE
    0.03: ...Item Not Null Validation: P19_BACKUP
    0.03: ...Item Not Null Validation: P19_PUBL_MAILING_LST
    0.03: ...Item Not Null Validation: P19_BUSINESS_REQS
    0.03: Branch point: BEFORE_PROCESSING
    0.03: Processing point: AFTER_SUBMIT
    0.03: ...Process "insert_tmp": DML_PROCESS_ROW (AFTER_SUBMIT) #OWNER#:tmp_guidebook:P19_ID:ID|I
    0.03: Show ERROR page...
    0.03: Performing rollback...
              "MYUSER_ID HERE"
    ORA-20513: Invalid action SAVE on this object. (I)
         Error      ORA-20513: Invalid action SAVE on this object. (I)
    I'm very confused by the error message, since I triple-checked the table, and the "delete" process on the *source* table works fine.
    Please help.
    I also found Andy's post suggesting creating a PL/SQL routine and calling it from an "on submit" process.
    http://forums.oracle.com/forums/thread.jspa?threadID=825248&start=15&tstart=0
    Just curious whether it's possible to do this using the standard form without going with Andy's suggestion.
    Please advise.
    Andrew

    Andy, thank you.
    It works now.
    A quick (hopefully) follow-up question:
    once I have inserted the row, I want to remove that row from the source table.
    Right now I have an unconditional process, but it's failing.
    See here:
    0.03: Processing point: ON_SUBMIT_BEFORE_COMPUTATION
    0.03: Branch point: BEFORE_COMPUTATION
    0.03: Computation point: AFTER_SUBMIT
    0.03: Tabs: Perform Branching for Tab Requests
    0.03: Branch point: BEFORE_VALIDATION
    0.03: Perform validations:
    0.03: ...Item Not Null Validation: P19_APPL_BUSINESS_AREA
    0.03: ...Item Not Null Validation: P19_APPL_SERVER
    0.03: ...Item Not Null Validation: P19_NSM_SUPPORT_LEVEL
    0.03: ...Item Not Null Validation: P19_ENV_TYPE
    0.03: ...Item Not Null Validation: P19_BACKUP
    0.03: ...Item Not Null Validation: P19_PUBL_MAILING_LST
    0.03: ...Item Not Null Validation: P19_BUSINESS_REQS
    0.03: Branch point: BEFORE_PROCESSING
    0.03: Processing point: AFTER_SUBMIT
    0.03: ...Process "Insert_Approved_GB_Process": DML_PROCESS_ROW (AFTER_SUBMIT) #OWNER#:TMP_GUIDEBOOK:P19_ID:ID|I
    0.05: ...Session State: Save "P19_ID" - saving same value: "44"
    0.05: ...Process "Delete_process": DML_PROCESS_ROW (AFTER_SUBMIT) #OWNER#:GB_PENDING_RQST:P19_ID:ID|D
    0.05: Show ERROR page...
    0.05: Performing rollback...
              "My user id here"
    Request id = 44 has been approved
    ORA-20512: Invalid action SAVE on this object. (D)
         Error      ORA-20512: Invalid action SAVE on this object. (D)
    OK      
    And the processes are defined on the page :
    Processes
    After Submit
    20     Insert_Approved_GB_Process     Automatic Row Processing (DML)     Conditional
    30     Delete_process     Automatic Row Processing (DML)     Unconditional
    40     reset page     Clear Cache for all Items on Pages (PageID,PageID,PageID)     Conditional
    My intent is that the "delete" should only occur if the "insert" was successful...
    ...And, to make life interesting, I have a button "Delete" that invokes the delete process to explicitly remove the row from the source table.
    thank you
    Andrew
    Edited by: andrey on Dec 11, 2008 10:35 AM
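
    For reference, since Andy's actual reply is not quoted above, a single on-submit PL/SQL process combining the two DML steps might look roughly like this (table names are taken from the log above; the column list is only a placeholder):
    BEGIN
       -- copy the approved row into the target table (column list is illustrative)
       INSERT INTO tmp_guidebook (id /* , other columns */)
       SELECT id /* , other columns */
       FROM   gb_pending_rqst
       WHERE  id = :P19_ID;
       -- delete it from the source table in the same transaction, so the
       -- delete only persists if the insert succeeded
       DELETE FROM gb_pending_rqst
       WHERE  id = :P19_ID;
    END;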

  • How to get Materialized View to ignore unused columns in source table

    When updating a column in a source table, records are generated in the corresponding materialized view log table. This happens even if the column being updated is not used in any MV that references the source table. That could be OK, so long as those updates are ignored. However they are not ignored, so when the MV is fast refreshed, I find it can take over a minute, even though no changes are required or made. Is there some way of configuring the materialized view log such that the materialized view refresh ignores these updates ?
    So for example, if I have table TEST:
    CREATE TABLE test (
      d_id NUMBER(10) PRIMARY KEY,
      d_name VARCHAR2(100),
      d_desc VARCHAR2(256)
    );
    This has an MV log MLOG$_TEST:
    CREATE MATERIALIZED VIEW LOG ON TEST with rowid, sequence, primary key;
    CREATE MATERIALIZED VIEW test_mv
    refresh fast on demand
    as
    select d_id, d_name
    from test;
    -- insert 200,000 records, then:
    exec dbms_mview.refresh('TEST_MV','f');
    update test set d_desc = upper(d_desc) ;
    exec dbms_mview.refresh('TEST_MV','f'); -- This takes 37 seconds, yet no changes are required.
    Oracle 10g/11g

    I would love to hear a positive answer to this question - I have the exact same issue :-)
    In the "old" days (version 8 I think it was) populating the materialized view logs was done by Oracle auto-creating triggers on the base table. A "trick" could then make that trigger become "FOR UPDATE OF <used_column_list>". Now-a-days it has been internalized so such "triggers" are not visible and modifiable by us mere mortals.
    I have not found a way to explicitly tell Oracle "only populate MV log for updates of these columns." I think the underlying reason is that the MV log potentially could be used for several different materialized views at possibly several different target databases. So to be safe that the MV log can be used for any MV created in the future - Oracle always populates MV log at any update (I think.)
    One way around the problem is to migrate to STREAMS replication rather than materialized views - but it seems to me like swatting a fly with a bowling ball...
    One thing to be aware of: once the MV log has been "bloated" with a lot of unnecessary logging, you may see that all your FAST REFRESHes afterwards become slow - even after the one that checked all the 200,000 unnecessary updates. We have seen that Oracle can decide on full table scanning the MV log when it does a fast refresh - which usually makes sense. But after a "bloat" has happened, the high water mark of the MV log is unnaturally high, which can make the full table scan slow by scanning a lot of empty blocks.
    We have a nightly job that checks whether each MV log is empty. If it is empty, it locks the MV log and the base table, checks for emptiness again, and truncates the MV log if it is still empty, before finally unlocking the tables. That way, if an update during the day has bloated the MV log, all the empty space in the MV log will be reclaimed at night.
    But I hope someone can answer both you and me with a better solution ;-)
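
    For what it's worth, a rough sketch of that nightly truncate-when-empty job, using the MLOG$_TEST example above (error handling omitted):
    DECLARE
       v_cnt PLS_INTEGER;
    BEGIN
       SELECT COUNT(*) INTO v_cnt FROM mlog$_test WHERE ROWNUM = 1;
       IF v_cnt = 0 THEN
          LOCK TABLE test IN EXCLUSIVE MODE;        -- block writers on the base table
          LOCK TABLE mlog$_test IN EXCLUSIVE MODE;  -- and on the log itself
          -- re-check emptiness now that we hold the locks
          SELECT COUNT(*) INTO v_cnt FROM mlog$_test WHERE ROWNUM = 1;
          IF v_cnt = 0 THEN
             EXECUTE IMMEDIATE 'TRUNCATE TABLE mlog$_test';  -- DDL commits and releases the locks
          ELSE
             ROLLBACK;  -- release the locks without truncating
          END IF;
       END IF;
    END;
    /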

  • Data Federator- Merging two source tables

    Hi,
    I have two identically structured source tables and I need to merge them into a target table.
    For Example:-
    Source Table1=  { (A,10), (B,11), (C,12) }
    +
    Source Table2=  { (D,15), (E,16), (F,17) }
    Target Table=  { (A,10), (B,11), (C,12), (D,15), (E,16), (F,17)}
    Can you please advise on how to achieve this in Data Federator?
    Thanks in advance.

    For Merge - Create a target table with your desired end columns. Create 2 separate mappings and map the respective columns from each source.
    For Join - Create a target table with your desired end columns. Create 1 mapping, map the respective columns from each source, and also create the necessary relationship between the tables.
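
    In plain SQL terms, the "merge" case is just a UNION ALL of the two identically structured sources (table and column names below are made up for illustration):
    SELECT letter_col, num_col FROM source_table1
    UNION ALL
    SELECT letter_col, num_col FROM source_table2;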

  • How to delete the source table rows once loaded in Destination Table in SSIS?

    Database = kssdata
    Tables = Userdetails, having 1000 rows
    Using SSIS:
    Taking an
    OLE DB Source----------------->OLE DB Destination
    I am taking 200 rows from the source table at a time and loading them into the destination table.
    Constraint: once 200 rows have been exported to the destination table, those 200 rows are deleted from the source table.
    The task repeats until all the records from the source table have been loaded into the destination table.

    Provided you have a sequential primary key or an audit timestamp (datetime/date) column in the table, you can take an approach like this:
    1. Add an Execute SQL Task connecting to the source database with the statement below:
    SELECT COUNT(*) FROM table
    Store the result in a variable
    2. Have another variable and set it to the expression below
    (@[User::CountVariable]/200) + (@[User::CountVariable]%200 > 0 ? 1 : 0)
    by setting EvaluateAsExpression to true. Here CountVariable is the variable created in the previous step.
    3. Have a For Loop Container with the settings below:
    InitExpression
    @NewVariable = @CounterVariable (the iteration-count variable created in step 2)
    EvalExpression
    @NewVariable > 0
    AssignExpression
    @NewVariable = @NewVariable - 1
    4. Add a Data Flow Task with an OLE DB source and OLE DB destination.
    5. Use a source query like:
    SELECT TOP 200 columns...
    FROM Table
    ORDER BY [PK | AuditColumn]
    Use PK or audit column depending which one is sequential
    6. After the data flow task, add an Execute SQL Task with a statement like the one below:
    DELETE t
    FROM (SELECT ROW_NUMBER() OVER (ORDER BY PK) AS Rn
          FROM Table) t
    WHERE Rn <= 200
    This will make sure 200 records get deleted each time.
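
    If the source and destination tables happen to be on the same SQL Server instance, a set-based alternative to the SSIS loop is DELETE ... OUTPUT, which inserts and deletes each batch in one atomic statement. A sketch, with hypothetical table and column names (note the OUTPUT INTO target must not have triggers or foreign keys):
    WHILE 1 = 1
    BEGIN
        -- move up to 200 rows: OUTPUT writes the deleted rows to the destination
        DELETE TOP (200) src
        OUTPUT deleted.Id, deleted.Name
        INTO dbo.Userdetails_Dest (Id, Name)
        FROM dbo.Userdetails AS src;

        IF @@ROWCOUNT = 0 BREAK;  -- stop once the source table is empty
    END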
    Visakh

  • How do I create a target table with the same PK as the source table?

    I am trying to create a target table in a mapping that will end up with the same primary key as the source table.
    It is a simple map that simply uses a subset of the columns of the source table in the target table. I wanted to create and bind a new table by dragging the columns I want from the source to the initially blank target table operator, changing the column names, and creating a primary key to match the source table.
    I can't seem to create a constraint on the table in the mapping. I can create the constraint after the table is created and bound to the database object, but the PK doesn't carry back into the mapping.
    I need it in the mapping so I can use the UPDATE/INSERT operation with the 'All Constraints' implementation. The mapping won't let me validate the object without the PK on it in the map.
    Believe it or not folks, I am getting better at this.
    Thanks very much for the guidance.
    Gary

    Hi Gary
    You are close, you are really close... :-))
    You need to do exactly as you propose, plus one extra step. Build the map as you describe, binding the new table to the target. Then edit the table definition to add the primary key and any other constraints you need. After this comes the step that you are missing.
    You need to do the following:
    1. Go back and re-edit the map
    2. Right click on the table
    3. From the pop up menu, select Reconcile Inbound
    4. Set any operators that you need for the UPDATE/INSERT
    5. Save the map
    6. Commit your changes
    The first three steps above make the map read in the indexes and constraints that you set on the table. Finally, you need to deploy the table and then deploy the map.
    Hope this helps
    Regards
    Michael

  • How to get source table inside Template Mapping code template

    Hi guys,
    I have the following scenario: I have a table from an external database and want to map it to an Oracle table. This is done with template mapping, and I selected a Load code template on the execution unit that holds only the external table; this load code template will read row by row from the source table and make the inserts into the flow table. I know that Oracle uses odiRef.getFrom() to construct the select statement from the external table. Because I need to do something custom, I need a list of the source tables inside the Load code template.
    Is this possible?
    P.S. I use owb 11gr2.
    Regards,
    Cipi
    Edited by: Iancu Ciprian on Jan 11, 2011 10:58 AM

    Hi Suraj,
    Thx for your answer!
    After posting the message I found another odiRef function in the ODI documentation, and I'm trying it now to see if it works; I will let you know my results...
    I implemented a custom iterator that retrieves the data from an external source and passes it to INSERT commands executed against the flow table. For this iterator to work I need the source table name of the current execution unit. The iterator then uses that name to get the data from the external entity and retrieve it as an array of Objects, which will be inserted into the flow table.
    Regards,
    Cipi

  • How to get source table name according to target table

    hi all
    another question:
    Once a map is created and deployed, the corresponding information is stored in the design repository and the runtime repository. My question is how to find the source table name corresponding to a given target table, and in which tables these records are recorded.
    Can somebody help me, please!
    thanks a lot!

    This is a query that will get you the operators in a mapping. To get source and targets you will need some additional information but this should get you started:
    set pages 999
    col PROJECT format a20
    col MODULE format a20
    col MAPPING format a25
    col OPERATOR format a20
    col OP_TYPE format a15
    select mod.project_name PROJECT
    , map.information_system_name MODULE
    , map.map_name MAPPING
    , cmp.map_component_name OPERATOR
    , cmp.operator_type OP_TYPE
    from all_iv_xform_maps map
    , all_iv_modules mod
    , all_iv_xform_map_components cmp
    where mod.information_system_id = map.information_system_id
    and map.map_id = cmp.map_id
    and mod.project_name = '&Project'
    order by 1,2,3;
    Jean-Pierre

  • How to get source table name

    Hi,
    I need to know how to get a source table name. I need to get the source table name and do some transformation. I am adding a step in the IKM to do this and therefore need the source table name there.
    <%=odiRef.getSrcTablesList("","[RES_NAME]","","")%> gives the work table name (C$_0XXXXX), whereas the actual source table name is expected.
    Could someone please help?
    Thanks.

    Hi,
    May I add a point?
    At the IKM level, if you use this API it will always return the C$ table name, since for the IKM the C$ table is the source. You need to capture and use this API at the LKM level.
    In the LKM, add a step with "Command on Target", set the technology to Jython, and try the code below.
    mySourceTable = '<%=odiRef.getSrcTablesList("", "[RES_NAME]", ", ", "")%>'
    Later, in the IKM, use this variable for your transformation.
    Thanks,
    Guru

  • SSIS 2012 is intermittently failing with below "Invalid date format" while importing data from a source table into a Destination table with same exact schema.

    We migrated packages from SSIS 2008 to 2012. The package is working fine in all environments except one.
    SSIS 2012 is intermittently failing with the error below while importing data from a source table into a destination table with the exact same schema.
    Error: 2014-01-28 15:52:05.19
       Code: 0x80004005
       Source: xxxxxxxx SSIS.Pipeline
       Description: Unspecified error
    End Error
    Error: 2014-01-28 15:52:05.19
       Code: 0xC0202009
       Source: Process xxxxxx Load TableName [48]
       Description: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 11.0"  Hresult: 0x80004005  Description: "Invalid date format".
    End Error
    Error: 2014-01-28 15:52:05.19
       Code: 0xC020901C
       Source: Process xxxxxxxx Load TableName [48]
       Description: There was an error with Load TableName.Inputs[OLE DB Destination Input].Columns[Updated] on Load TableName.Inputs[OLE DB Destination Input]. The column status returned was: "Conversion failed because the data value overflowed
    the specified type.".
    End Error
    But when we reorder the "Updated" column in the destination table, the package imports data successfully.
    This looks like a bug to me. Any suggestions?

    Hi Mohideen,
    Based on my research, the issue might be related to one of the following factors:
    Memory pressure. Check whether there is a memory shortage when the issue occurs. In addition, if the package runs in the 32-bit runtime on that server, use the 64-bit runtime instead.
    A known issue with SQL Server Native Client. As a workaround, use the .NET data provider instead of SNAC.
    Hope this helps.
    Regards,
    Mike Yin
    TechNet Community Support

  • Error in updating the source table in mapping

    Hi All,
    I have a mapping in which I am fetching records from a table A, performing some expression logic, and then using a Splitter operator to update my target table B as well as update one of the columns of my source table A (to indicate that the record has been processed). When I execute the mapping, the update of target table B works, but the update of my source table A does not work, and the job completes successfully.
    Can somebody please help me to debug this?
    Thanks

    Hi,
    Please check the splitter condition for the two groups that you have set. This might be a case where the condition you have put in the splitter sends all the records to the target table and none to the source table, so the source table is not updated by any record.
    Regards
    -AP

  • Best approach to delete records that are not in the source table anymore.

    I have a situation where I need to remove records from dimensions that are not in the source data anymore. Right now we are not maintaining history, i.e. not using SCDs, but we are planning to for the next release. If we did that, it would be easy to identify the latest records. The load is nightly, and records are updated and new ones added.
    The approach I am considering is to join the dimension tables to the sources on keys and delete what doesn't join. However, is there perhaps some function in OWB that would do this automatically on import, so it can also be in place for the future?
    Thanks!

    Bear in mind that deleting dimension records becomes problematic if you have facts attached to them. Just because a record is no longer in the active set doesn't mean that it wasn't used historically, and so it may have foreign key constraints on it in your database. If this is the case, a short-term solution would be to add an expiry_date field to the dimension and update the load to set this value when the record disappears, rather than to delete it.
    To do that, use the target dimension as a source table and outer join it to the actual source table on the natural key; your update then sets expiry_date = nvl(expiry_date, sysdate) on all records where the outer join fails, stamping sysdate only if the record has not already been expired.
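
    In plain SQL that update amounts to something like the following, with hypothetical table and key names (NOT EXISTS playing the role of the failed outer join):
    UPDATE dim_customer d
    SET    d.expiry_date = NVL(d.expiry_date, SYSDATE)
    WHERE  NOT EXISTS (SELECT 1
                       FROM   src_customer s
                       WHERE  s.customer_nk = d.customer_nk);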
    Further consideration: what do you do if the record is re-inserted into the source table? Create a new dimension key, or remove the expiry date?
    But I will say that I am not a fan of deleting records in most circumstances. What do you do if you discover a calculation error and need to fix that and republish historical cubes? Without the historical data, you lose the ability to do things like that.

  • Any way to know the source table of a Form at  runtime

    Hi,
    Is there any way to know the source table/view of a form at run time?
    Thanks in advance

    Hi,
    "But Get_Block_Property(..., DML_DATA_TARGET_NAME) should be coded. I mean, is there any way to know it at runtime, the way we get errors from Help->Display Errors?"
    If you are asking if there is a way to get this information from the default menu, then the answer is no. However, you can create your own custom menu, or a button in your form, that displays that information using Get_Block_Property when you click on your menu item or button.
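
    For example, a WHEN-BUTTON-PRESSED trigger along these lines (a sketch; it assumes you want the block the cursor is in) would display the block's source table:
    DECLARE
       v_src VARCHAR2(61);
    BEGIN
       -- DML_DATA_TARGET_NAME returns the table or view the block is based on
       v_src := Get_Block_Property(:SYSTEM.CURSOR_BLOCK, DML_DATA_TARGET_NAME);
       Message('This block is based on: ' || v_src);
    END;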

  • Custom delta extractor: All data deleted in source table in R/3

    Hi everyone,
    I have made a custom delta extractor from R/3 to a BW system. The setup is the following:
    The source table in R/3 holds a timestamp, which is used for the delta. The data is afterwards loaded to a DSO in the BW system. The extractor works as expected with delta capability. Furthermore if I delete a record in the source table, this is not transmitted to the DSO, which is also as expected.
    The issue is this, however: if we delete all data in the source table, then on the next load there is a request showing 1 record transferred to the DSO. This request, however, does not show up in the PSA, and afterwards all data fields in the DSO are set to initial.
    Does anyone know why this happens?
    Thank you in advance.
    Philip R. Jarnhus

    Hi Philip,
    As you have used a generic extractor I am not sure how ROCANCEL will behave, but you can look up 0RECORDMODE for more information.
    Regards,
    Durgesh.

  • Join two source tables and replicat into a target table with BLOB

    Hi,
    I am working on an integration to source transaction data from a legacy application to an ESB using GG.
    What I need to do is join two source tables (to de-normalize the area_id) to form the transaction detail, then transform it by concatenating the transaction detail fields into a values-only CSV, and replicate it into the target ESB IN_DATA table's BLOB content field.
    Based on what I have researched, a lookup joining two source tables requires SQLEXEC, which doesn't support BLOBs.
    What alternatives are there, and what does GG recommend for such a use case?
    Any helpful advice is much appreciated.
    thanks,
    Xiaocun

    Xiaocun,
    Not sure what your data looks like, but it's possible the comma separated value (CSV) requirement may be solved by something like this in your MAP statement:
    colmap (usedefaults,
    my_blob = @STRCAT (col02, ",", col03, ",", col04))
    Since this is not 1:1 you'll be using a sourcedefs file, which is nice because it will do the datatype conversion for you under the covers (also a nice trick when migrating long raws to blobs). So col02 can be varchar2, col03 a number, and col04 a clob and they'll convert in real-time.
    Mapping two tables to one is simple enough with two MAP statements (see the sketch after the parameter list below); the harder challenge is joining operations from separate transactions, because OGG is operation-based and doesn't work on aggregates. It's possible you could end up using a combination of built-in parameters and functions with SQLEXEC and SQL/PL/SQL for more complicated scenarios, all depending on the design of the target table. But you have several scenarios to address.
    For example, is the target table really a history table or are you actually going to delete from it? If just the child is deleted but you don't want to delete the whole row yet, you may want to use NOCOMPRESSDELETES & UPDATEDELETES and COLMAP a new flag column to denote it was deleted. It's likely that the insert on the child may really mean an update to the target (see UPDATEINSERTS).
    If you need to update the LOB by appending or prepending new data then that's going to require some custom work, staging tables and a looping script, or a user exit.
    Some parameters you may want to become familiar with if not already:
    COLS | COLSEXCEPT
    COLMAP
    OVERRIDEDUPS
    INSERTDELETES
    INSERTMISSINGUPDATES
    INSERTUPDATES
    GETDELETES | IGNOREDELETES
    GETINSERTS | IGNOREINSERTS
    GETUPDATES | IGNOREUPDATES
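
    As a starting point, the two-MAP approach for the simple insert path might look like this in the replicat parameter file (schema, table, and column names are all hypothetical):
    MAP legacy.txn_header, TARGET esb.in_data,
      COLMAP (USEDEFAULTS, content = @STRCAT(col02, ",", col03, ",", col04));
    MAP legacy.txn_detail, TARGET esb.in_data,
      COLMAP (USEDEFAULTS, content = @STRCAT(col02, ",", col03, ",", col04));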
    Good luck,
    -joe
