Dml_Handler at the Schema Level?

Hi:
I'm using 11g R2 and doing a one-way Streams replication within the same database. I've got a subset of tables within one schema set up for capture, and I'm using DML handlers on apply. The handlers are specified on each table and call a package that takes each captured LCR and writes its data out to a different table than the one the LCR was captured from. This was done for a bolt-on reporting issue that popped up. This is all working great. Streams rocks!
Here's my next issue. I want to expand/morph the above approach in the following way. I want to do my capture at the schema level for all tables and also run just one dml_handler that takes all LCRs for the specified schema and writes them out as XML into a CLOB column. I've got the XML portion working, and I pretty much know how I can get the Streams part going as well, using a dml_handler-per-table approach similar to what I did above. What I would like to know is whether there's a way to avoid having to set up a dml_handler for each INSERT, UPDATE, and DELETE LCR on every table within the specified schema.
Instead of doing this....
BEGIN
DBMS_APPLY_ADM.SET_DML_HANDLER(
object_name => 'schema.table_a',
object_type => 'TABLE',
operation_name => 'INSERT',
error_handler => false,
user_procedure => 'package.procedure',
apply_database_link => NULL,
apply_name => 'apply_name');
END;
BEGIN
DBMS_APPLY_ADM.SET_DML_HANDLER(
object_name => 'schema.table_a',
object_type => 'TABLE',
operation_name => 'UPDATE',
error_handler => false,
user_procedure => 'package.procedure',
apply_database_link => NULL,
apply_name => 'apply_name');
END;
BEGIN
DBMS_APPLY_ADM.SET_DML_HANDLER(
object_name => 'schema.table_a',
object_type => 'TABLE',
operation_name => 'DELETE',
error_handler => false,
user_procedure => 'package.procedure',
apply_database_link => NULL,
apply_name => 'apply_name');
END;
...and so on, once for each table in the schema.
I'd like to be able to do the following:
BEGIN
DBMS_APPLY_ADM.SET_DML_HANDLER(
schema_name => 'schema', ---This line is totally made up by me. The real argument is object_name
object_type => 'TABLE',
operation_name => 'ALL', ---This line is also totally made up by me. The real allowed options are Insert, Update, and Delete.
error_handler => false,
user_procedure => 'package.procedure',
apply_database_link => NULL,
apply_name => 'apply_name');
END;
Is there a way to do this? I don't see a procedure in dbms_apply_adm to accomplish it, unless I've just missed it. I could also do this within a loop using dynamic SQL, but I'm hoping I won't have to.
Thanks for any help!
Cheers,
Mike

Schema level is not possible with this procedure.
However, you can set operation_name to 'DEFAULT', which indicates all operations (INSERT/UPDATE/DELETE/LOB_UPDATE), so you only need one call per table instead of three.
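For what it's worth, a rough sketch of the per-table loop (no dynamic SQL is actually needed, since SET_DML_HANDLER takes the object name as an ordinary parameter); the schema, package, and apply names are just the placeholders from your example:
BEGIN
  FOR t IN (SELECT owner, table_name
              FROM dba_tables
             WHERE owner = 'SCHEMA') LOOP
    DBMS_APPLY_ADM.SET_DML_HANDLER(
      object_name         => t.owner || '.' || t.table_name,
      object_type         => 'TABLE',
      operation_name      => 'DEFAULT',
      error_handler       => false,
      user_procedure      => 'package.procedure',
      apply_database_link => NULL,
      apply_name          => 'apply_name');
  END LOOP;
END;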

Similar Messages

  • Ask about DML Handler for Streams at the Schema level ?

    Hi all!
    I am using Oracle version 10.2.0.
    I have two databases: A (on machine A, used as the source database) and B (on machine B, the destination database). Some changes from A will be applied to B.
    On B, I installed the Oracle client so I could use the EMC (Enterprise Manager Console) tool to generate scripts, and I used them to configure the Streams environment. I configured Streams at the schema level (DML and DDL) and it succeeded. But I have two problems:
    + I wrote a DML handler called "emp_dml_handler" and want to set it on the EMP table only. So must I also call DBMS_STREAMS_ADM.ADD_TABLE_RULES? (I configured DBMS_STREAMS_ADM.ADD_SCHEMA_RULES), such as:
    BEGIN
    DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
    schema_name => '"HOSE"',
    streams_type => 'APPLY',
    streams_name => 'STRMADMIN_BOSCHOSE_REGRES',
    queue_name => 'apply_dest_hose',
    include_dml => true,
    include_ddl => true,
    source_database => 'DEVELOP.REGRESS.RDBMS.DEV.US.ORACLE.COM');
    END;
    and after:
    DECLARE
    emp_rule_name_dml VARCHAR2(50);
    emp_rule_name_ddl VARCHAR2(50);
    BEGIN
    DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name => 'HOSE.EMP',
    streams_type => 'APPLY',
    streams_name => 'STRMADMIN_BOSCHOSE_REGRES',
    queue_name => 'apply_dest_hose',
    include_dml => true,
    include_ddl => true,
    source_database => 'DEVELOP.REGRESS.RDBMS.DEV.US.ORACLE.COM',
    dml_rule_name => emp_rule_name_dml,
    ddl_rule_name => emp_rule_name_ddl);
    DBMS_APPLY_ADM.SET_ENQUEUE_DESTINATION(
    rule_name => emp_rule_name_dml,
    destination_queue_name => 'apply_dest_hose');
    END;
    BEGIN
    DBMS_APPLY_ADM.SET_DML_HANDLER(
    object_name => 'HOSE.EMP',
    object_type => 'TABLE',
    operation_name => 'UPDATE',
    error_handler => false,
    user_procedure => 'strmadmin.emp_dml_handler',
    apply_database_link => NULL,
    apply_name => NULL);
    END;
    ... similar for INSERT and DELETE...
    I think I should only configure Streams at the schema level and exclude the EMP table; am I right?
    + At the source, the EMP table has a primary key, and I configured:
    ALTER TABLE HOSE.EMP ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS;
    ==> So, at the destination, is there any work I must do to configure a substitute key for the EMP table?
    Any ideas on these problems?
    Thanks
    Edited by: changemylife on Sep 24, 2009 10:45 PM

    If you want to exclude EMP from the schema rule, just add a negative rule, either on capture or on apply.
    What is the purpose of :
    DBMS_APPLY_ADM.SET_ENQUEUE_DESTINATION(
    rule_name => emp_rule_name_dml,
    destination_queue_name => 'apply_dest_hose');
    It sounds like you are enqueuing into 'apply_dest_hose' all the rows for this table that come from ... 'apply_dest_hose'.
    Next, you declare a DML_HANDLER that is attached to nobody:
    BEGIN
    DBMS_APPLY_ADM.SET_DML_HANDLER(
    object_name => 'HOSE.EMP',
    object_type => 'TABLE',
    operation_name => 'UPDATE',
    error_handler => false,
    user_procedure => 'strmadmin.emp_dml_handler',
    apply_database_link => NULL,
    apply_name => NULL);           <----- nobody rules the world!
    END;
    The sequence of evaluation is normally:
    APPLY_PROCESS (reader)
      --> RULE SET
            --> RULE ...
            --> RULE
                  --> evaluates OK --> does a DML_HANDLER exist?
                        --> YES --> call the DML_HANDLER --> on LCR.EXECUTE, hand the LCR to the coordinator
                        --> NO  --> implicit apply (give the LCR to the coordinator, which dispatches it to one apply server)
    Since your dml_handler is attached to a null apply process, it will never be called by anybody, and your LCRs for table EMP will be implicitly applied by the apply process.
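    To illustrate, a sketch only, reusing the names from your own post, with the handler tied to the apply process:
    BEGIN
    DBMS_APPLY_ADM.SET_DML_HANDLER(
    object_name => 'HOSE.EMP',
    object_type => 'TABLE',
    operation_name => 'UPDATE',
    error_handler => false,
    user_procedure => 'strmadmin.emp_dml_handler',
    apply_database_link => NULL,
    apply_name => 'STRMADMIN_BOSCHOSE_REGRES');
    END;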

  • EXPORT at schema level, but exclude some tables within the export

    I have been searching, but had no luck in finding the correct syntax for my situation.
    I'm simply trying to export at the schema level, but I want to omit certain tables from the export.
    exp cltest/cltest01@clprod file=exp_CLPROD092508.dmp log=exp_CLPROD092508.log statistics=none compress=N
    Thanks!

    Hi,
    Think simple first: you can use the TABLES clause.
    Example:
    exp scott/tiger file=empdept.expdat tables=(EMP,DEPT) log=empdept.log
    That works well if your schema contains only a small number of tables.
    If you have a large number of tables (hundreds) in your schema, an alternative workaround is to create a new schema and, using CTAS, create copies there of the tables you want to skip from the current schema. Do the export, and once the job is done you can recreate the skipped tables from the new schema and import into the destination DB.
    - Pavan Kumar N
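    If typing the TABLES list by hand is impractical (classic exp has no EXCLUDE parameter), one hedged alternative is to generate the list from the dictionary and paste it into an exp parameter file; the two table names to skip are only examples:
    SELECT table_name
      FROM user_tables
     WHERE table_name NOT IN ('TABLE_TO_SKIP1', 'TABLE_TO_SKIP2')
     ORDER BY table_name;
    The result goes into the tables=( ... ) clause of the parameter file.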

  • Schema Level Problem !

    Hi,
    I have successfully configured schema-level replication, but a few issues are still in front of me.
    So, I am going to explain my scenario as follows:
    1: Site1 has an Oracle 10g DB with two schema-level processes: Capture and Propagate.
    2: Site2 has an Oracle 10g DB with two schema-level processes: Capture and Propagate.
    3: Central_DB has an Oracle 10g DB with two APPLY processes, one for Site1 and a second for Site2.
    I plan to connect a total of 18 branches to our Central_DB in the future.
    From all sites to Central_DB we will configure schema-level replication using Streams (all sites have the same schema) (DML + DDL).
    But from Central_DB to all sites we will configure table-level rule-based replication (DMLs only).
    My questions are:
    1: Please advise whether this is a good model or not.
    2: When Site1 makes DML or DDL changes locally, the changes are captured by the capture process, propagated from the source queue (Site1) to the destination queue (Central_DB), and applied by the apply process on Central_DB.
    > If Site1 makes changes and at the same time Central_DB goes down and restarts, the changes are still applied from Site1 to Central_DB (normal working).
    > If Site1 makes changes while Central_DB is down, keeps making more changes locally, and after a few minutes Site1 also goes down, then both Site1 and Central_DB are down.
    > After one day, when both machines are up and running again, the propagation process reports connection errors and no further changes are replicated, even though the status of the processes is ENABLED.
    ERROR:
    ORA-12545: Connect failed because target host or object does not exist
    Please assist me in troubleshooting why no further changes are replicated.
    Thanks,
    Fazi

    That would be the case if an IP or DNS lookup (depending on what is specified as HOST in the tnsnames entry) is failing.
    Possibly the lookups before the servers went down were via DNS and the DNS server is no longer available (has it gone down?), or someone has recreated the hosts file and the earlier entry for the target host is missing.
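    As a quick check (the alias and host name below are purely illustrative), look at the HOST value in the tnsnames.ora entry that the propagation database link uses and test it with tnsping:
    CENTRAL_DB =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = central-db.example.com)(PORT = 1521))
        (CONNECT_DATA = (SERVICE_NAME = centraldb))
      )
    If tnsping CENTRAL_DB cannot resolve the host, either replace HOST with the IP address or fix the DNS/hosts entry.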

  • Schema level trigger

    Hello,
    Please clarify something for me about the AFTER LOGON trigger (defined at the schema level). I want to monitor how many times a user logs on/off. When I specify ON SCHEMA, the trigger fires for that specific user. If I specify ON DATABASE, the trigger fires for all users. But what's the difference? When a user logs on, he logs on to a "schema", so no matter which user logs on, the ON SCHEMA trigger always fires.
    User A logs on, ON SCHEMA trigger fires.
    User B logs on, ON SCHEMA trigger fires.
    If i specify ON DATABASE, the trigger also fires, because the user logs on.
    So, what's the difference?
    Thanks

    Roger22 wrote:
    so, can anyone answer?
    You misunderstand "schema"; the triggers and the docs are a bit misleading. When you create, for example, an AFTER LOGON trigger you can specify:
    ON DATABASE
    which means the trigger will fire for every user who logs in. Or you can specify:
    ON schema_name.SCHEMA
    which means the trigger will fire only when user schema_name logs in. Below is an example:
    SQL> connect scott
    Enter password: *****
    Connected.
    SQL> create or replace
      2  trigger after_logon_db
      3  after logon
      4  on database
      5  begin
      6  raise_application_error(-20900,'Logion is not allowd.');
      7  end;
      8  /
    Trigger created.
    SQL> connect u1/u1
    ERROR:
    ORA-00604: error occurred at recursive SQL level 1
    ORA-20900: Logion is not allowd.
    ORA-06512: at line 2
    Warning: You are no longer connected to ORACLE.
    SQL> connect u2/u2
    ERROR:
    ORA-00604: error occurred at recursive SQL level 1
    ORA-20900: Logion is not allowd.
    ORA-06512: at line 2
    SQL> connect scott
    Enter password: *****
    Connected.
    SQL> drop trigger after_logon_db
      2  /
    Trigger dropped.
    SQL> create or replace
      2  trigger after_logon_u1
      3  after logon
      4  on u1.schema
      5  begin
      6  raise_application_error(-20900,'Logion is not allowd.');
      7  end;
      8  /
    Trigger created.
    SQL> connect u1/u1
    ERROR:
    ORA-00604: error occurred at recursive SQL level 1
    ORA-20900: Logion is not allowd.
    ORA-06512: at line 2
    Warning: You are no longer connected to ORACLE.
    SQL> connect u2/u2
    Connected.
    SQL>
    As you can see, the ON DATABASE trigger fired for both users U1 and U2. The trigger ON U1.SCHEMA fired for user U1 only. I hope this answers your question.
    SY.

  • Creating DDL at schema level

    Hi,
    Is there a way to create DDL for all objects at the schema level from OEM?
    Thanks.

    DBMS_METADATA
    With DBMS_METADATA you can retrieve complete database object definitions (metadata) from the dictionary by specifying:
    The type of object, for example, tables, indexes, or procedures
    Optional selection criteria, such as owner or name
    Parse items (attributes of the returned objects that are to be parsed and returned separately).
    Optional transformations on the output. By default the output is represented in XML, but callers can specify transformations (into SQL DDL, for example), which are implemented by XSLT (Extensible Stylesheet Language Transformation) stylesheets stored in the database or externally.
    DBMS_METADATA provides the following retrieval interfaces:
    For programmatic use: OPEN, SET_FILTER, SET_COUNT, GET_QUERY, SET_PARSE_ITEM, ADD_TRANSFORM, SET_TRANSFORM_PARAM, FETCH_xxx and CLOSE retrieve multiple objects.
    For use in SQL queries and for browsing: GET_XML and GET_DDL return metadata for a single named object. The GET_DEPENDENT_XML, GET_DEPENDENT_DDL, GET_GRANTED_XML, and GET_GRANTED_DDL interfaces return metadata for one or more dependent or granted objects.
    This chapter discusses the following topics:
    Summary of DBMS_METADATA Subprograms
    http://download-west.oracle.com/docs/cd/B10501_01/appdev.920/a96612/d_metada.htm#ARPLS026
    Joel Pérez
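    For example, a minimal sketch that pulls the table DDL for one schema (the owner 'SCOTT' is a placeholder; other object types need their own GET_DDL calls or the programmatic OPEN/FETCH interface):
    SET LONG 100000 PAGESIZE 0
    SELECT DBMS_METADATA.GET_DDL('TABLE', table_name, 'SCOTT')
      FROM dba_tables
     WHERE owner = 'SCOTT';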

  • Sequence nextval after an schema level export import

    If I export a schema that has some sequences and then truncate the tables and then import the schema, do I get the old sequence values?
    I guess my question is do sequences get stored at the schema level or the database level.
    I noticed that sequences are exported at the schema level when I do an export so that may be the answer but your confirmation would be appreciated.

    Hi,
    Nothing to worry about: imp/exp does not change the value of NEXTVAL. You can truncate the tables after the export and then import.
    Regards
    Vinay Agarwal
    OCP
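    If you want to double-check it yourself, note the current values before the export and compare them after the import; a simple dictionary query is enough:
    SELECT sequence_name, last_number
      FROM user_sequences
     ORDER BY sequence_name;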

  • "Setup encountered a problem while validating the state of Active Directory: Exchange organization-level objects have not been created, and setup cannot create them because the local computer is not in the same domain and site as the schema master. Run se

    Team,
    I am trying to install Exchange in my lab and am getting the error message below.
    The schema master role is installed in the root domain, and I am trying to install Exchange in a child domain.
    One root domain and one child domain; both are located in a single site.
    “Setup encountered a problem while validating
    the state of Active Directory: Exchange organization-level objects have not been created, and setup cannot create them because the local computer is not in the same domain and site as the schema master. Run setup with the /prepareAD parameter and wait for
    replication to complete.”
    I followed the articles below:
    http://support.risualblogs.com/blog/2012/02/21/exchange-2010-sp2-upgrade-issue-exchange-organization-level-objects-have-not-been-created-and-setup-cannot-create-them-because-the-local-computer-is-not-in-the-same-domain-and-site-as-the-sche/
    http://www.petenetlive.com/KB/Article/0000793.htm
    I transferred the schema roles to a different server in the root domain, but still no luck.
    Can someone please help me?
    regards
    Srinivasa k
    Srinivasa K

    Hi Srinivasa,
    I guess you didn't complete the initial schema prep and AD prep steps before starting the installation. You can do them as follows:
    1. Open command Prompt as administrator and browse to the root of installation cd and run Setup.exe /PrepareSchema /IAcceptExchangeServerLicenseTerms
    After finishing this,
    2. Setup.exe /PrepareAD /OrganizationName:"<organization name>" /IAcceptExchangeServerLicenseTerms
    3. To prepare all domains within the forest run Setup.exe /PrepareAllDomains /IAcceptExchangeServerLicenseTerms. If you want to prepare a specific domain run Setup.exe /PrepareDomain:<FQDN of the domain you want to prepare> /IAcceptExchangeServerLicenseTerms
    4. Once you complete all of the 3 steps, install the prerequisites for Exchange 2013
    5. Finally, run the setup program
    Hope this will help you
    Regards from Visit ExchangeOnline |
    Visit WindowsAdmin

  • How to use the 4 level Categorisation Schema

    Hi experts,
    We assign the subject profile to the schema for multilevel categorization in the service order.
    But the subject profile has only three levels: catalogs / code groups / codes.
    How can we use more than three levels of multilevel categorization?
    Could a schema be assigned to the service order directly?

    hi,
    follow these steps:
    In SPRO
    1. Customer Relationship Management >> Interaction Center WebClient >> Business Transactions >> Define Categorization Profiles
    2. Assign this to the web-client profile in Categorization
    3. Default Business transaction type: Make the transaction type you are using for service ticket/ service order default in Business Transaction profile
    4. Customer Relationship Management >> CRM Cross-Application Components >> Multilevel Categorization >> Define Application Areas for Categorization >> check the basis of categorization. Select and go to Parameters. It should generally be the Subject Profile.
    5. Customer Relationship Management >> Transactions >> Settings for Complaints >> Settings for Subjects >>
    a.Define Catalogs
    b.Define Code Groups and Codes for Catalogs
    c.Define Code Group Profiles
    d.Define Subject Profiles
    e.Assign Subject Profiles to Transaction Types
    In Easy Access
    1. Create a short-cut for CRM BSP Application for CRMM_ERM_CAT. A new short-cut will be prepared in your favorites. This will launch the Category Modeler in a browser window. This needs Java.
    2. Once launched Create a new Schema
    3. Assign the schema to your categorization schema
    4. In General Data tab provide the Subject Profile that has been created.
    5. In Categories create the hierarchy as per your business logic and assign the specific code belonging to the code group profile in the subject profile
    6. Once done, come to the General tab, add 10-15 minutes to the current system time and enter that in the Valid From time. This is the time after which the category modeler will be activated once released.
    7. Change the status from Draft to Released. Depending upon 'From Date' and 'Valid From Time', the category modeler will be activated. This can be seen in the Service Order / Service Ticket that you have specified, below the description.
    Hope this helps.
    Best regards
    Pankaj Kumar

  • How can we take the incremental export using the datapump at schema level.

    Hi,
    How can we take the incremental export using the datapump at schema level. Example, today i had taken the full export of one schema.
    After 7 days from now , how can i take the export of change data , from the data of my full export.
    using the export datapump parameter - FLASHBACK_TIME ,can we mention the date range. Example sysdate - (sysdate -7)
    Please advice
    thanks
    Naveen.

    Think of the Data Pump Export/Import tools as taking a "picture" or "snapshot."
    When you use these utilities it exports the data as it appears (by default) while it's doing the export operation. There is no way to export a delta in comparison to a previous export.
    The FLASHBACK_TIME parameter allows you to get a consistent export as of a particular point in time.
    I recommend you go to http://tahiti.oracle.com. Click on your version and go to the Utilities Guide. This is where all the information on Data Pump is located and should answer a lot of your questions.
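    For what it's worth, a hedged sketch of a parameter file for a consistent schema-level export (invoked as expdp scott/tiger parfile=exp_scott.par; the schema, directory, and file names are placeholders). It is still a full snapshot as of that time, not a delta:
    schemas=SCOTT
    directory=DATA_PUMP_DIR
    dumpfile=scott_full.dmp
    logfile=scott_full.log
    flashback_time="TO_TIMESTAMP(TO_CHAR(SYSDATE,'YYYY-MM-DD HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS')"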

  • Report Designer error in VS 2013 - "Data at the root level is invalid."

    Hi,
    I am trying to create an rdlc file programmatically, using a memory table as the dataset. Here is my code:
    ' For each field in the resultset, add the name to an array list
      Dim m_fields As ArrayList
      m_fields = New ArrayList()
      Dim i As Integer
      For i = 0 To tbdataset.Tables(0).Columns.Count - 1
          m_fields.Add(tbdataset.Tables(0).Columns(i).ColumnName.ToString)
      Next i
      'Create Report
      'http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition
      'http://schemas.microsoft.com/sqlserver/reporting/2010/01/reportdefinition
      ' Open a new RDL file stream for writing
      Dim stream As FileStream
      stream = File.OpenWrite("D:\MyTestReport2.rdlc")
      Dim writer As New XmlTextWriter(stream, Encoding.UTF8)
      ' Causes child elements to be indented
      writer.Formatting = Formatting.Indented
      ' Report element
      writer.WriteProcessingInstruction("xml", "version=""1.0"" encoding=""utf-8""")
      writer.WriteStartElement("Report")
      writer.WriteAttributeString("xmlns", Nothing, "http://schemas.microsoft.com/sqlserver/reporting/2010/01/reportdefinition")
      writer.WriteAttributeString("xmlns:rd", "http://schemas.microsoft.com/SQLServer/reporting/reportdesigner")
      writer.WriteStartElement("ReportSections")
      writer.WriteStartElement("ReportSection")
      writer.WriteElementString("Width", "11in")
      writer.WriteStartElement("Body")
      writer.WriteElementString("Height", "5in")
      writer.WriteStartElement("ReportItems")
      writer.WriteStartElement("Tablix")
      writer.WriteAttributeString("Name", Nothing, "Tablix1")
      writer.WriteElementString("Top", ".5in")
      writer.WriteElementString("Left", ".5in")
      writer.WriteElementString("Height", ".5in")
      writer.WriteElementString("Width", (m_fields.Count * 1.5).ToString() + "in")
      writer.WriteStartElement("TablixBody")
      ' Tablix Columns
      writer.WriteStartElement("TablixColumns")
      For Each fieldName In m_fields
          writer.WriteStartElement("TablixColumn")
          writer.WriteElementString("Width", "1.5in")
          writer.WriteEndElement() ' TableColumn
      Next fieldName
      writer.WriteEndElement() ' TablixColumns
      ' Header Row
      writer.WriteStartElement("TablixRows")
      writer.WriteStartElement("TablixRow")
      writer.WriteElementString("Height", ".25in")
      writer.WriteStartElement("TablixCells")
      For Each fieldName In m_fields
          writer.WriteStartElement("TablixCell")
          writer.WriteStartElement("CellContents")
          writer.WriteStartElement("Textbox")
          writer.WriteAttributeString("Name", Nothing, "Header" + fieldName)
          ' writer.WriteAttributeString("CanGrow", True)
          ' writer.WriteAttributeString("Keeptogether", True)
          writer.WriteStartElement("Paragraphs")
          writer.WriteStartElement("Paragraph")
          writer.WriteStartElement("TextRuns")
          writer.WriteStartElement("TextRun")
          writer.WriteElementString("Value", fieldName)
          writer.WriteStartElement("Style")
          writer.WriteElementString("TextDecoration", "Underline")
          writer.WriteElementString("PaddingTop", "0in")
          writer.WriteElementString("PaddingLeft", "0in")
          writer.WriteElementString("LineHeight", ".5in")
          ''writer.WriteElementString("Width", "1.5in")
          ''writer.WriteElementString("Value", fieldName)
          writer.WriteEndElement() ' Style
          writer.WriteEndElement() ' TextRun
          writer.WriteEndElement() ' TextRuns
          writer.WriteEndElement() ' Paragraph
          writer.WriteEndElement() ' Paragraphs
          writer.WriteEndElement() ' TexBox
          writer.WriteEndElement() ' CellContents
          writer.WriteEndElement() ' TablixCell
      Next
      writer.WriteEndElement() ' TablixCells
      writer.WriteEndElement() ' TablixRow
      'writer.WriteEndElement() ' TablixRows - do not close the Rows tag here; close it after the details
      'End of Headers
      'Details Rows
      'writer.WriteStartElement("TablixRows") - since the Rows tag in the header is not closed, no need to open a fresh tag
      writer.WriteStartElement("TablixRow")
      writer.WriteElementString("Height", ".25in")
      writer.WriteStartElement("TablixCells")
      For Each fieldName In m_fields
          writer.WriteStartElement("TablixCell")
          writer.WriteStartElement("CellContents")
          writer.WriteStartElement("Textbox")
          writer.WriteAttributeString("Name", Nothing, fieldName)
          '  writer.WriteAttributeString("CanGrow", True)
          '  writer.WriteAttributeString("Keeptogether", True)
          writer.WriteStartElement("Paragraphs")
          writer.WriteStartElement("Paragraph")
          writer.WriteStartElement("TextRuns")
          writer.WriteStartElement("TextRun")
          'writer.WriteElementString("Value", fieldName)
          writer.WriteElementString("Value", "=Fields!" + fieldName + ".Value")
          writer.WriteStartElement("Style")
          writer.WriteElementString("TextDecoration", "Underline")
          writer.WriteElementString("PaddingTop", "0in")
          writer.WriteElementString("PaddingLeft", "0in")
          writer.WriteElementString("LineHeight", ".5in")
          ''writer.WriteElementString("Width", "1.5in")
          ''writer.WriteElementString("Value", fieldName)
          writer.WriteEndElement() ' Style
          writer.WriteEndElement() ' TextRun
          writer.WriteEndElement() ' TextRuns
          writer.WriteEndElement() ' Paragraph
          writer.WriteEndElement() ' Paragraphs
          writer.WriteEndElement() ' TexBox
          writer.WriteEndElement() ' CellContents
          writer.WriteEndElement() ' TablixCell
      Next
      writer.WriteEndElement() ' TablixCells
      writer.WriteEndElement() ' TablixRow
      writer.WriteEndElement() ' TablixRows
      'End of Details Rows
      writer.WriteEndElement() ' TablixBody
      writer.WriteStartElement("TablixRowHierarchy")
      writer.WriteStartElement("TablixMembers")
      writer.WriteStartElement("TablixMember")
      ' Group
      writer.WriteElementString("KeepWithGroup", "After")
      writer.WriteEndElement() ' TablixMember
      ' Detail Group
      writer.WriteStartElement("TablixMember")
      writer.WriteStartElement("Group")
      writer.WriteAttributeString("Name", Nothing, "Details")
      writer.WriteEndElement() ' Group
      writer.WriteEndElement() ' TablixMember
      writer.WriteEndElement() ' TablixMembers
      writer.WriteEndElement() ' TablixRowHierarchy
      writer.WriteStartElement("TablixColumnHierarchy")
      writer.WriteStartElement("TablixMembers")
      'writer.WriteStartElement("TablixMember")
      For Each fieldName In m_fields
          writer.WriteStartElement("TablixMember")
          writer.WriteEndElement() ' TablixMember
      Next
      ' writer.WriteEndElement() ' TablixMember
      writer.WriteEndElement() ' TablixMembers
      writer.WriteEndElement() ' TablixColumnHierarchy
      writer.WriteElementString("DataSetName", "tbdataset")
      writer.WriteEndElement() ' Tablix
      writer.WriteEndElement() ' ReportItems
      writer.WriteEndElement() ' Body
      writer.WriteStartElement("Page")
      ' Page Header Element
      writer.WriteStartElement("PageHeader")
      writer.WriteElementString("Height", "1.40cm")
      writer.WriteStartElement("ReportItems")
      writer.WriteStartElement("Textbox")
      writer.WriteAttributeString("Name", Nothing, "Textbox1")
      writer.WriteStartElement("Paragraphs")
      writer.WriteStartElement("Paragraph")
      writer.WriteStartElement("TextRuns")
      writer.WriteStartElement("TextRun")
      writer.WriteElementString("Value", Nothing, "ABC CHS.")
      writer.WriteEndElement() ' TextRun
      writer.WriteEndElement() ' TextRuns
      writer.WriteEndElement() ' Paragraph
      writer.WriteEndElement() ' Paragraphs
      writer.WriteEndElement() ' TextBox
      writer.WriteEndElement() ' ReportItems
      writer.WriteEndElement() ' PageHeader
      writer.WriteEndElement() ' Page
      writer.WriteEndElement() ' ReportSection
      writer.WriteEndElement() ' ReportSections
      ' DataSources
      writer.WriteStartElement("DataSources")
      writer.WriteStartElement("DataSource")
      writer.WriteAttributeString("Name", Nothing, "tbdata")
      writer.WriteStartElement("DataSourceReference")
      writer.WriteEndElement() ' DataSourceReference
      writer.WriteEndElement() ' DataSource
      writer.WriteEndElement() ' DataSources
      'DataSet
      writer.WriteStartElement("DataSets")
      writer.WriteStartElement("DataSet")
      writer.WriteAttributeString("Name", Nothing, "tbdataset")
      writer.WriteStartElement("Query")
      writer.WriteElementString("DataSourceName", Nothing, "tbdata")
      'writer.WriteElementString("CommandText", Nothing, "/* Local Query */")
      writer.WriteElementString("CommandText", Nothing, "TableDirect")
      writer.WriteEndElement() ' Query
      'Fields
      writer.WriteStartElement("Fields")
      For Each fieldName In m_fields
          writer.WriteStartElement("Field")
          writer.WriteAttributeString("Name", Nothing, fieldName)
          writer.WriteElementString("DataField", fieldName)
          writer.WriteElementString("rd:TypeName", fieldName.GetType.ToString)
          writer.WriteEndElement() ' Field
      Next
      writer.WriteEndElement() ' Fields
      ' rd datasetinfo
      writer.WriteEndElement() ' DataSet
      writer.WriteEndElement() ' DataSets
      writer.WriteEndElement() ' Report
      ' Flush the writer and close the stream
      writer.Flush()
      stream.Close()
      'Convert to Stream
      Dim myByteArray As Byte() = System.Text.Encoding.UTF8.GetBytes("D:\MyTestReport2.rdlc")
      Dim ms As New MemoryStream(myByteArray)
      'Supply Stream to ReportViewer
      ReportViewer1.LocalReport.LoadReportDefinition(ms)
      ReportViewer1.LocalReport.Refresh()
    When I open the rdlc in the designer I get the following error: "Data at the root level is invalid." When I run the aspx I get the following error:
    An error occurred during local report processing.
    The definition of the report '' is invalid.
    The definition of this report is not valid or supported by this version of Reporting Services.
    The report definition may have been created with a later version of Reporting Services, or contain content that is not well-formed or not valid based on Reporting Services schemas.
    Details: Data at the root level is invalid. Line 1, position 1.
    Can anybody guide me?

    Hi RACES,
    Based on your description, I'm afraid that it is not the correct forum for this issue, but maybe I could help you find a more appropriate forum.
    If it is related to the Visual Studio Report Controls, you could select this forum:
    https://social.msdn.microsoft.com/Forums/en-US/home?forum=vsreportcontrols
    But if it is the SSRS issue, maybe this forum would be better for you:
    https://social.msdn.microsoft.com/Forums/sqlserver/en-US/home?forum=sqlreportingservices
    Best Regards,
    Jack

  • How to obtain the schema contents using XDK

    I have a requirement to access the entire schema - all elements, all simple types, complex types, annotations, etc. with the schema processor of the XDK in Java. I am wondering how to do this. All I get are the top level elements. I am not too sure how to browse each such entity. I cannot get things like hasChildren() and getChildren() (like in DOM tree nodes) in XSDNode or its implementations (XSDElement, XSDComplexType, etc). Could anyone help please?

    XMLSchema schema;
    Hashtable hashtable=schema.getXMLSchemaNodeTable();
    http://otn.oracle.com/tech/xml/xdk/doc/beta/doc/java/javadoc/oracle/xml/parser/schema/XMLSchemaNode.html

  • How to add new tables in Streams for Schema level replication ( 10.2.0.3 )

    Hi,
    I am in the process of setting up Oracle Streams schema-level replication on version 10.2.0.3. I was able to set up replication for one table properly. Now I want to add 10 more tables to the schema-level replication. A few questions regarding this:
    1. If I create new tables in the source, do I have to create the tables in the target database manually, or do I do an export/import with STREAMS_INSTANTIATION=Y?
    2. Can you tell me a Metalink note ID to read more on this topic?
    thanks & regards
    parag

    The same capture and apply processes can be used to replicate the other tables. The following steps should meet your need:
    Say table NEW is the new table to be added with owner SANTU
    downstr_cap is the capture process which is already running
    downstr_apply is the apply process which is already there
    1. Now stop the apply process
    2. Stop the capture process
    3. Add the new table in the capture process using +ve rule
    BEGIN
    DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name      => 'SANTU.NEW',
    streams_type    => 'capture',
    streams_name    => 'downstr_cap',
    queue_name      => 'strmadmin.DOWNSTREAM_Q',
    include_dml     => true,
    include_ddl     => true,
    source_database => ' Name of the source database ',
    inclusion_rule  => true);
    END;
    4. Take export of the new table with "OBJECT_CONSISTENT=Y" option
    5. Import the table at destination with "STREAMS_INSTANTIATION=Y' option
    6. Start the apply process
    7. Start the capture process
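    Steps 4 and 5 would look roughly like this with the original exp/imp utilities (hedged; the passwords, file names, and target connect string are placeholders):
    exp system/password tables=SANTU.NEW object_consistent=y file=santu_new.dmp log=exp_new.log
    imp system/password@target fromuser=SANTU touser=SANTU tables=NEW streams_instantiation=y file=santu_new.dmp log=imp_new.log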

  • SQLServer 2008 R2: The SCHEMA LOCK permission was denied on the object

    Hi all,
    I encounter the following error while developing a SSRS project that connects to a SQL Server Database view:
    "Msg 229, Level 14, State 71, Procedure sp_getschemalock, Line 1
    The SCHEMA LOCK permission was denied on the object 'Table4', database 'DBRemote', schema 'dbo'."
    That view uses a linked server to select data from a remote SQL Server Database (SQL Server 2005).
    There are no SQL hints specified in my views.
    My view T-SQL is:
    Select
    From linksv.DBRemote.dbo.Table1 T1
    Inner Join linksv.DBRemote.dbo.Table2 T2 On
      T1.fk1 = T2.pk1
    Inner Join view1 v1 On
      T2.fk2 = v1.pk2
    My t-sql for view1 is:
    Select
    From linksv.DBRemote.dbo.Table3 T3 
    Inner Join linksv.DBRemote.dbo.Table4 T4 On
      t3.fk1 = T4.pk1
    The object specified in the error message above refers to table "linksv.DBRemote.dbo.Table4" (see the view above).
    SQL Server Permissions are set for all objects involved in the queries above.
    The funny thing is that the error occurs when I run my report from the report server web interface
    while my report project is loaded in BIDS at the same time.
    The error occurs when I execute the query in SSMS 2008 and also when I run the query
    in BIDS 2008 Query designer.
    I am also wondering why the error refers only to the "linksv.DBRemote.dbo.Table4" remote object
    and not to the other remote objects in that query.
    I'm not sure where to look further for what might be causing this error.
    Appreciate any help very much.
    Thanks
    Bodo

    Yes, this error happens because the login that is mapped on the other side of the linked server is missing the read permission on the object. All queries done through a linked server acquire a schema lock just so that SQL can read the results correctly.
    I don't know exactly WHY it does it this way, but it does.
    To fix the error message, give the required permission to the login on the server that is the target of the linked-server configuration, be it Windows Authentication, a SQL login, or connections "made using the login's current security context". The preferable
    way is to map every login that will be used 1-to-1 in the Security tab of the Linked Server Properties page.
    I made a post about this:
    http://thelonelydba.wordpress.com/2013/04/17/sql-and-linked-servers-the-schema-lock-permission-was-denied-on-the-object/
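    In other words, on the server that the linked server points to, grant read access to the mapped login; the table and login names below are only illustrative:
    GRANT SELECT ON dbo.Table4 TO [DOMAIN\LinkedServerLogin];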

  • Getting the schema using Oracle SQL Developer 1.5.1

    I need to generate a report in CSV/XLS format using Oracle SQL Developer 1.5.1 to get the schema details
    in the below format.
    Table Name: XXXXXXXXX
    Table Space Name: XXXXXXXXXXXXXXX
    Structure :
    Field Name Data Type Size
    Xxxxxxxx xxxxxx xxxxx
    Xxxxxxxx xxxxxx xxxxx
    Xxxxxxxx xxxxxx xxxxx
    Xxxxxxxx xxxxxx xxxxx
    Field level constraint:
    Xxxxxxxxxxxxxxxxxx
    Table Level Constraint:
    Xxxxxxxxxxxxxxxxxx
    Indexes:
    Index Name Column Name(s) Table space name
    Xxxxxxxx xxxxxxxxx xxxxxxxxxxxx
    Xxxxxxxx xxxxxxxxx xxxxxxxxxxxx
    Xxxxxxxx xxxxxxxxx xxxxxxxxxxxx
    Sequence Number:
    Xxxxxxxxxxxxxxxx
    Triggers:
    Xxxxxxxxxxxxxxxxxx
    I am using a query to do that, but I cannot run more than one query in the Create Report Dialog.
    I went to the Menu->View->Reports. In that, I chose User Defined Reports, created a new folder called Schema Detail and in that a report named Schema Detail. I right click on that report, choose Edit option, Create Report Dialog opens.
    In the Create Report Dialog, I chose the option Script for Value of Style.
    In the SQL column I enter a query "select * from table_names". But, if I enter another query after that, it does not run and reports an error "SQL Error: ORA-00933: SQL command not properly ended
    00933. 00000 - SQL command not properly ended" Cause:    Action: "
    I end the first query with a semi colon, but it does not work.
    The queries run fine in a SQL Worksheet. There, I can terminate the first query with a semicolon and enter the second query. Then I run both of them together so that the result set of the second query appears after that of the first.
    Am I doing something wrong? Can someone please advise on how to get the information I need?
    Thanks a lot.

    You can do it in SQL Worksheet because you are running a script. You could run the script in SQL*Plus, and even start it from a bat/cmd file.
    Or if you want this as a User Defined Report, you might make it a PL/SQL style report. Here, all output is done through DBMS_OUTPUT.PUT or DBMS_OUTPUT.PUT_LINE. You can output HTML tags or plain text. I have an example in my paper for ODTUG Kaleidoscope 2009.
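    A minimal sketch of such a PL/SQL-style report (the HTML markup and the exact dictionary views are up to you; this one only lists tables and their columns):
    BEGIN
      FOR t IN (SELECT table_name, tablespace_name
                  FROM user_tables
                 ORDER BY table_name) LOOP
        DBMS_OUTPUT.PUT_LINE('<h3>Table: ' || t.table_name ||
                             ' (tablespace ' || t.tablespace_name || ')</h3>');
        FOR c IN (SELECT column_name, data_type, data_length
                    FROM user_tab_columns
                   WHERE table_name = t.table_name
                   ORDER BY column_id) LOOP
          DBMS_OUTPUT.PUT_LINE(c.column_name || ' ' || c.data_type ||
                               ' (' || c.data_length || ')<br/>');
        END LOOP;
      END LOOP;
    END;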
