Ad Hoc Distributed Queries

Hello Experts
I was trying to enable 'Ad Hoc Distributed Queries' using
sp_configure 'show advanced options', 1;
RECONFIGURE;
sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;
GO
but I am getting the following error. Please advise, thank you.
Msg 5833, Level 16, State 1, Line 1
The affinity mask specified is greater than the number of CPUs supported or licensed on this edition of SQL Server

Hi,
Why the duplicate post? Please avoid duplicates:
http://social.technet.microsoft.com/Forums/en-US/2ebf1d6e-ffe3-41bf-b741-5f1e4f08f46e/ad-hoc-distributed-queries?forum=sqldatabaseengine
Please refer to the marked answer in the thread below:
http://social.msdn.microsoft.com/Forums/sqlserver/en-US/700bd948-1a1d-4912-ac6d-723d4478bd55/license-issues-when-virtualizing-a-sql-2008-onto-windows-2003?forum=sqlsetupandupgrade
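For what it's worth, the error text itself points at the 'affinity mask' setting rather than at 'Ad Hoc Distributed Queries': RECONFIGURE appears to fail because the stored affinity mask exceeds the CPUs supported or licensed on this edition. A rough sketch of one common remediation (please confirm against the linked threads before running it on a production instance):
EXEC sp_configure 'affinity mask';        -- check the currently stored value
EXEC sp_configure 'affinity mask', 0;     -- 0 = let SQL Server use all licensed CPUs
RECONFIGURE WITH OVERRIDE;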

Similar Messages

  • Xml data type is not supported in distributed queries. Remote object 'OPENROWSET' has xml column(s).

    Hi,
    Can anyone help me out please.
    I have written a stored procedure to create views using OPENROWSET (OPENQUERY), but for tables that contain xml data types it throws an error while executing the SP:
    " Xml data type is not supported in distributed queries. Remote object 'OPENROWSET' has xml column(s)."
    Please refer to the stored procedure and error message below.
    USE [Ice]
    GO
    /****** Object:  StoredProcedure [dbo].[Pr_DBAccess]    Script Date: 08/14/2014 16:08:20 ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    ALTER PROCEDURE [dbo].[Pr_DBAccess]
    (
        @SERVERTYPE   NVARCHAR(50),
        @SERVERNAME   NVARCHAR(100),
        @DATABASENAME NVARCHAR(100),
        @SCHEMANAME   NVARCHAR(100),
        @TABLENAME    NVARCHAR(100),
        @USERNAME     NVARCHAR(100),
        @PASSWORD     NVARCHAR(100)
    )
    AS
    BEGIN
        DECLARE @openquery      NVARCHAR(4000),
                @ETL_CONFIG_IDN NVARCHAR(100);

        IF @SERVERTYPE = 'SQL'
        BEGIN
            SET @openquery =
                'CREATE VIEW ' + @TABLENAME +
                ' WITH ENCRYPTION AS SELECT * FROM OPENROWSET(''SQLNCLI'',''SERVER=' + @SERVERNAME + ';TRUSTED_CONNECTION=YES;'',''SELECT * FROM ' + @DATABASENAME + '.' + @SCHEMANAME + '.' + @TABLENAME + ''')'
            SELECT @openquery
        END

        EXECUTE sp_executesql @openquery
    END
    ---- While running the SP manually, the error below occurred.

    Hi,
    1. You cannot reference a table or view that contains an xml or CLR type column via a four-part name in a distributed query.
    2. You need to cast the column to nvarchar(max), varbinary(max), or another appropriate type.
    3. If a table has an xml column, for example, create a view on the remote server that contains all columns other than the xml one and query that instead, or issue a pass-through query using OPENQUERY with only the appropriate columns (see the sketch after the workaround below).
    Here is a work around:
    SELECT
          CAST(a.XML_Data AS XML) AS XML_Data
    FROM
          OPENQUERY([LINKED SERVER NAME HERE],'
              SELECT
                CAST(XML_Data AS VARCHAR(MAX)) AS XML_Data   -- plain VARCHAR would truncate to 30 characters
             FROM
                [DATABASE NAME].[SCHEMA].[TABLE NAME]'
    ) a
    Basically, the remote server runs the query and converts the XML data to varchar, the data is sent to the requesting server, and it is then converted back to XML there.
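    For point 3 above, a minimal sketch of the view-based approach (dbo.Orders and its xml column OrderXml are hypothetical names for illustration) is to create a view on the remote server that leaves the xml column out and reference that view instead of the table:
    -- on the REMOTE server: expose everything except the xml column
    CREATE VIEW dbo.Orders_NoXml
    AS
    SELECT OrderID, CustomerID, OrderDate   -- all columns except OrderXml
    FROM dbo.Orders;
    -- on the LOCAL server: the four-part name now works, because no xml column is exposed
    SELECT * FROM [LINKED SERVER NAME HERE].[DATABASE NAME].dbo.Orders_NoXml;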
    You can find more detail in the link below:
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/c6e0f4da-821f-4ba2-9b01-c141744076ef/xml-data-type-not-supported-in-distributed-queries?forum=transactsql
    Thanks

  • OLE DB provider 'MSOLAP' cannot be used for distributed queries because the provider is configured to run in single-threaded

    Hopefully this will save somebody some trouble.
    Running 64bit Enterprise SQL and SSAS with Service pack 2 installed.
    Also running ProClarity, so Reporting Services is running in 32-bit mode.
    When trying to create a linked server to my OLAP database I was continually getting the following Error:
    OLE DB provider 'MSOLAP' cannot be used for distributed queries because the provider is configured to run in single-threaded apartment mode. (Microsoft SQL Server, Error: 7308)
    Many posts suggested I select the "in Proc" check box under the OLAP provider, but this did not help.
    Finally, instead of using the IDE to create the linked server I used a script to call sp_addlinkedserver and used @provider='MSOLAP.3'.  This fixed the problem.
    If you have a clearer idea of why I was having the issue in the first place, feel free to let me know what you think.
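    For reference, the scripted version looks roughly like this (the linked server name, SSAS instance and catalog below are placeholders, not values from the original post):
    EXEC master.dbo.sp_addlinkedserver
        @server     = N'OLAP_LINKED',      -- placeholder linked server name
        @srvproduct = N'',
        @provider   = N'MSOLAP.3',         -- the provider version that resolved the error
        @datasrc    = N'MySsasServer',     -- placeholder SSAS instance
        @catalog    = N'MyOlapDatabase';   -- placeholder SSAS database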

    Try this thread:
    https://social.msdn.microsoft.com/Forums/sqlserver/en-US/f02a3921-7d0b-4038-97bb-9f17d381b345/linked-ssas-server-fails?forum=sqlanalysisservices

  • Facing problem in distributed queries over Oracle linked server

    Hi,
    I have a SQL Server 2005 x64 Standard Edition SP3 instance. On the instance, we had distributed (4-part) queries over an Oracle linked server running fine just a few hours back, but now they have started taking too long. They seem to work fine when OPENQUERY
    is used. I have a huge number of queries using the same mechanism and it is not feasible for me to convert all of them to OPENQUERY; please help in getting this resolved.
    Thanks in advance.

    Hi Ashutosh,
    According to your description, you face performance issues with distributed queries and it is not feasible for you to convert all queries to OPENQUERY. To improve the performance, you could follow the solutions below:
    1. Make sure that you have a high-speed network between the local server and the linked server.
    2. Use the DRIVING_SITE hint, which forces query execution to be done at a different site than the initiating instance.
    This is done when the remote table is much larger than the local table and you want the work (join, sorting) done remotely to save the back-and-forth network traffic. In the following example, we use the driving_site hint to force the "work"
    to be done on the site where the huge table resides:
    select /*+ DRIVING_SITE(h) */ ename
    from   tiny_table t,
           huge_table@remote h
    where  t.deptno = h.deptno;
    3. Use views. For instance, you could create a view on the remote site referencing the tables and call the remote view from the local site, as in the following example:
    create view local_cust as select * from cust@remote;
    4. Use procedural code. On some rare occasions it can be more efficient to replace a distributed query with procedural code, such as a PL/SQL procedure or a precompiler program.
    For more information about the process, please refer to the article:
    http://www.dba-oracle.com/t_sql_dblink_performance.htm
    Regards,
    Michelle Li
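    Since OPENQUERY already performs well here, a low-effort middle ground (a sketch only; the linked server, view and column names are placeholders) is to hide the pass-through query behind a local SQL Server view, so that existing statements only change the object they reference instead of being rewritten one by one:
    CREATE VIEW dbo.v_remote_orders
    AS
    SELECT *
    FROM OPENQUERY(ORA_LINK, 'SELECT order_id, customer_id, order_date FROM app.orders');
    The text inside OPENQUERY is still executed entirely on the Oracle side, which is what makes the pass-through form fast in the first place.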

  • Linked Server and Distributed Queries  in Oracle

    In MSSQL, linked servers and distributed queries give SQL Server access to data from remote data sources. How about in Oracle?
    I have a table A at server A and a table B at server B, and I want to join these two tables together. How can I do this in Oracle?

    Use a database link: http://www.stanford.edu/dept/itss/docs/oracle/10g/server.101/b10759/statements_5005.htm
    For instance, if you have created on database A a link to database B named 'database_b',
    you can use
    select * from table@database_b
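    For example (the link name, credentials, TNS alias and table names are placeholders):
    -- on database A, as a user with the CREATE DATABASE LINK privilege
    CREATE DATABASE LINK database_b
      CONNECT TO scott IDENTIFIED BY tiger
      USING 'DB_B_TNS_ALIAS';
    -- join the local table A with the remote table B
    SELECT a.id, a.col1, b.col2
    FROM table_a a
    JOIN table_b@database_b b ON b.id = a.id;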

  • Distributed queries+pipelined table function

    Hi friends,
    Can I get better performance for distributed queries if I use a pipelined table function? I have got my data distributed across three different databases.
    Thanks
    Somy

    You will need to grant EXECUTE access on the pipelined table function to whatever users want it. When other users call this function, they may need to prefix the schema owner (e.g. <<owner>>.getValue('001')) unless you've set up the appropriate synonym.
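    A minimal sketch (the owner and grantee names are placeholders; getValue is the function name from the example above):
    GRANT EXECUTE ON app_owner.getValue TO report_user;
    CREATE PUBLIC SYNONYM getValue FOR app_owner.getValue;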
    What version of SQL*Plus do you have on the NT machine?
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • Distributed Queries - Index Server

    Using Kodo, is it possible to perform distributed queries, that is, to combine a standard tabular SQL query with one which queries an index server?
    I suppose the real question is: is it possible to query a full text index
    using Kodo?

    Kodo doesn't provide any built-in support for querying a separate
    server for a particular JDOQL query. If the index server has a JDBC API,
    then it wouldn't be too difficult to issue the query using a separate
    PMF for the index server, and then manually join the results to get back
    the appropriate objects from the main database.
    There are also a bunch of interesting things you can do with custom
    field/class mappings; you might want to investigate these APIs
    (preferably in 3.0, where they are more sophisticated).
    Finally, the next release of 3.0 will contain a new "textindex" sample,
    which demonstrates how you might roll your own full text index purely in
    JDO.
    Marc Prud'hommeaux
    SolarMetric Inc. http://www.solarmetric.com

  • Distributed Queries w/interMedia

    Does interMedia support simple distributed queries such as the following:
    select doc_id from doc_table@dblink where contains(text,'November',0)>0;

    Originally posted by Paul Dixon: "This does not work so far in >= 8i."
    I finally figured this one out. Add @dblink between "contains" and the "(". Works fine like this against 8.1.7.
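    In other words, the working form of the original query (per the fix above) is:
    select doc_id from doc_table@dblink where contains@dblink(text,'November',0)>0;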

  • Are iPhone ad-hoc distributed apps updatable through App Store or iTunes?

    Hello
    Say I ad-hoc distributed my beta to some testers.
    If I then later release the app to the App Store, will the testers be able to update their ad-hoc versions to the released version (either through App Store Updates, or iTunes updates)?
    A further question: if they are able to do so, would they have to pay for it if I am charging for it on the App Store?
    I've been searching for this answer for quite some time, any input is greatly appreciated.
    Thank you.

    Unfortunately that's not possible. iTunes sees them as two different applications - even if you use the same bundle ID, name and certificate. The problem is the code signing. App Store apps are signed by Apple and you. Ad hoc distributed apps are signed only by you.
    When your app is updated in iTunes, it will run alongside any version that has been installed via ad hoc distribution.

  • Tuning Distributed Queries

    Hello to all,
    I have a new scenario; please help me out with how to handle it, or with what tool or technique is appropriate for it.
    My scenario relates to distributed queries. I want to tune this process. My working environment is Windows Server 2003 with 11g.
    I run a query from one server which involves db links to extract data from two other databases. But I observe that the query runs on the first node and then on the second node, whereas I want both servers utilized at the same time; using db links in the query does not solve the problem. Could Oracle parallel server be the solution, or is there another? Please help me decide which architecture would be appropriate in this scenario.
    regards
    Adeel

    Yes, I have started this thread. I don't know what the problem is with my OTN user ID; sometimes it shows ORACLE student and sometimes the ID.
    -- sum(store_code) calculated on the local server, Node0
    select sum(store_code) from
    (
    -- this query runs on one server, say Node1
    select * from PK0113_jt@D2_9_2010634013255218206180
    inner join PK0113_dg0@D2_9_2010634013255218206180 on PK0113_dg0.dg0_id=PK0113_jt.dg0_id
    where store_code=113
    union all
    -- this query runs on another server, say Node2
    select * from PK113S_jt@D2_12_2010634015728553161367
    inner join PK113S_dg0@D2_12_2010634015728553161367 on PK113S_dg0.dg0_id=PK113S_jt.dg0_id
    where store_code=102
    ) t
    I send this request from the local server to two star schemas whose data reside on two different servers. In my case the first request goes to node 1 and then to node 2, and finally the answer is returned; the execution plan shows REMOTE in the plan.

  • Optimizing Distributed Queries

    Hello All,
    We have a serious problem optimizing a job that fetches data (200k+ rows) through views that reside in a remote db and inserts it into a local table. We are running this job from the local db, and due to constraints we can neither create any object in the remote db nor do we have select access to the tables in the remote db. We were given a grant to select from the remote views built on the remote base tables. How do we optimize the job? We tried 2 methods and neither was faster (one hour plus min....)
    example 1: (using the driving_site hint & append hint)
    begin
    for irec in (select /*+ driving_site(c) */ c.customer_id, c.customer_name, d.dept_id, d.dept_name
                 from customers_view@remotedb c, departments_view@remotedb d
                 where d.unique_id = c.unique_id)
    loop
    insert /*+ append */
    into local_table (cust_id, cust_name, dept_id, dept_name)
    values (irec.customer_id,
            irec.customer_name,
            irec.dept_id,
            irec.dept_name);
    end loop;
    commit;
    end;
    example 2: (conventional insert with the append hint; driving_site will not work here)
    insert /*+ append */
    into local_table (cust_id, cust_name, dept_id, dept_name)
    select c.customer_id, c.customer_name, d.dept_id, d.dept_name
    from customers_view@remotedb c, departments_view@remotedb d
    where d.unique_id = c.unique_id;
    Limitations:
    1) We do not have the privilege to run explain plan for the remote objects :( so whatever we do, we have no clue whether it will increase performance!
    2) The job fetches data only from remote objects (views) and no local objects.
    3) We are not allowed to create any object in the remote db (we will never get a grant for that, so no second thoughts about creating objects in the remote db to increase performance).
    If anyone has encountered similar problems or has any suggestions to optimize, please do help us out. Thank you all in advance.

    Dev_Indy wrote:
    "Thanks Tubby for your suggestion, will give it a shot for sure and let you know how it worked!!"
    No problem. Please do let us know how that works out for you :)
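    For anyone hitting the same constraints: one alternative that is often suggested for this pattern (a sketch only, not from this thread; whether it helps depends on where the time is actually going) is to replace the row-by-row loop in example 1 with an array fetch and a bulk insert, so the remote rows are pulled and written in batches:
    DECLARE
      CURSOR c_src IS
        SELECT c.customer_id, c.customer_name, d.dept_id, d.dept_name
        FROM customers_view@remotedb c, departments_view@remotedb d
        WHERE d.unique_id = c.unique_id;
      TYPE t_rows IS TABLE OF c_src%ROWTYPE;
      l_rows t_rows;
    BEGIN
      OPEN c_src;
      LOOP
        FETCH c_src BULK COLLECT INTO l_rows LIMIT 10000;  -- pull 10k rows per round trip
        EXIT WHEN l_rows.COUNT = 0;
        FORALL i IN 1 .. l_rows.COUNT
          INSERT INTO local_table VALUES l_rows(i);        -- columns match by position
        COMMIT;
      END LOOP;
      CLOSE c_src;
    END;
    /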

  • Distributed Queries

    I want to query data from 2 tables residing on another Oracle database based on a date value that is a variable.
    The SQL statement below works:
    select t1.market_cd, t1.NT_LOC_ENTITY_CD, t1.NTI_NO
    FROM customer_order@phoenix t1, Customer_Order_line@phoenix t2
    WHERE t1.nt_LOC_ENTITY_CD = '515'
    and t1.NTI_NO NOT LIKE 'Y%'
    AND customer_po NOT LIKE 'TD0000%'
    AND customer_po NOT LIKE 'TDMN%'
    and Customer_NO NOT LIKE '20352%'
    and t1.nti_no = t2.nti_no AND
    t1.bo_no = t2.Bo_no AND
    contract_annix not like 'B06%' AND
    contract_annix not like 'B17%' AND
    LINE_ITEM_SEQ_NO='0000' AND
    t2.actual_ship_date IS NULL AND
    (t2.Orig_sched_ship_date = '30-DEC-00'OR t2.Orig_cust_req_date <= '30-DEC-00');
    When I try to replace the hardcoded date '30-DEC-00' above with a variable as shown below, I get errors.
    As BEGIN
    SELECT sysdate INTO todaysDate from Dual;
    INSERT INTO Uma
    select t1.market_cd, t1.NT_LOC_ENTITY_CD, t1.NTI_NO
    FROM customer_order@phoenix t1, Customer_Order_line@phoenix t2
    WHERE t1.nt_LOC_ENTITY_CD = '515'
    and t1.NTI_NO NOT LIKE 'Y%'
    AND customer_po NOT LIKE 'TD0000%'
    AND customer_po NOT LIKE 'TDMN%'
    and Customer_NO NOT LIKE '20352%'
    and t1.nti_no = t2.nti_no AND
    t1.bo_no = t2.Bo_no AND
    contract_annix not like 'B06%' AND
    contract_annix not like 'B17%' AND
    LINE_ITEM_SEQ_NO='0000' AND
    t2.actual_ship_date IS NULL AND
    (t2.Orig_sched_ship_date = todaysDate OR t2.Orig_cust_req_date <= '30-DEC-00');
    end;
    Can anyone tell me how I can rewrite this query to use variables?
    Thank you.

    Hi,
    I don't know if this may be a workaround, but you may try:
    - Create a view at the PHOENIX db as:
    CREATE OR REPLACE
    VIEW Customer_Full_Orders
    AS
    SELECT t1.market_cd,
    t1.nt_loc_entity_cd,
    t1.nti_no,
    customer_po,
    customer_no,
    contract_annix,
    line_item_seq_no,
    t2.actual_ship_date,
    t2.orig_sched_ship_date,
    t2.orig_cust_req_date
    FROM customer_order t1,
    customer_order_line t2
    WHERE t1.nti_no = t2.nti_no
    AND t1.bo_no = t2.bo_no
    /
    If the filter conditions can be hard-coded, you may skip some fields and include the filter in the view.
    - Recreate the procedure as:
    DECLARE
    v_Filter_Date DATE := SYSDATE;
    BEGIN
    INSERT
    INTO uma
    SELECT r.market_cd,
    r.nt_loc_entity_cd,
    r.nti_no
    FROM Customer_Full_Orders@PHOENIX r
    WHERE r.nt_loc_entity_cd = '515'
    AND r.nti_no NOT LIKE 'Y%'
    AND r.customer_po NOT LIKE 'TD0000%'
    AND r.customer_po NOT LIKE 'TDMN%'
    AND r.contract_annix NOT LIKE 'B06%'
    AND r.contract_annix NOT LIKE 'B17%'
    AND r.actual_ship_date IS NULL
    AND (r.orig_sched_ship_date = v_Filter_Date
    OR r.orig_cust_req_date <= v_Filter_Date);
    END;
    /
    Hope this is useful.
    Bye Max

  • Trouble Turning on FileStream/Ad Hoc Query with new SQL Server 2014 64-Bit Version

    SQL Server Help,
    I have recently installed SQL Server 2014 on my Windows 7 SP1 Acer laptop.
    I am trying to do a simple upload of an Excel file into SQL Server using the OPENROWSET command,
    which I found requires a configuration change that I attempted below.
    When I execute this:
    execute sp_configure 'show advanced options', 1;
    RECONFIGURE;
    execute sp_configure 'Ad Hoc Distributed Queries', 1;
    RECONFIGURE;
    GO
    I get this message
    Configuration option 'show advanced options' changed from 1 to 1. Run the RECONFIGURE statement to install.
    Msg 5593, Level 16, State 1, Line 2
    FILESTREAM feature is not supported on WoW64. The feature is disabled.
    Msg 15123, Level 16, State 1, Procedure sp_configure, Line 62
    The configuration option 'Ad Hoc Distributed Queries' does not exist, or it may be an advanced option.
    Msg 5593, Level 16, State 1, Line 4
    FILESTREAM feature is not supported on WoW64. The feature is disabled.
    When I go to SQL Server Services and the server Properties and try to turn the FILESTREAM options on, I get the error:
    "There was an unknown error applying the FILESTREAM settings.  Check the parameters are valid (0x80070002)."
    Perhaps my issue is with the Windows Share Name:  .  I have put in a local drive C:\A_Excel...., also the computer name and my user ID. Is there a special format or way to define a Share Name that I am unaware of?
    Is there a way to fix this?  Can I not load Excel files with a 64-Bit system?  Or what is the easiest way that I can do this with my configuration?
    Thanks for any help.

    FILESTREAM feature is not supported on WoW64. The feature is disabled.
    Judging by the error message, you have a 64-bit OS but have installed a 32-bit SQL Server, and FILESTREAM is not supported in WoW64 mode. Install a 64-bit SQL Server instead.
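    Once a 64-bit instance (and the 64-bit Access Database Engine) is in place, the Excel read itself would look roughly like this (the workbook name Book1.xlsx and sheet name Sheet1 are placeholders; only C:\A_Excel is taken from your post):
    SELECT *
    FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                    'Excel 12.0;Database=C:\A_Excel\Book1.xlsx;HDR=YES',
                    'SELECT * FROM [Sheet1$]');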
    Olaf Helper

  • SQL Server Agent and OpenDataSource failing

    I have a stored procedure that is using an OpenDataSource query to export data from SQL Server into an Excel file for download. I’ve installed the “Microsoft Access Database Engine 2010” and can see it listed in SQL Server Management Studio under Server Objects > Linked Servers > Providers > Microsoft.ACE.OLEDB.12.0.
    If I manually run the stored procedure (when logged in as the OS’s administrator), the stored procedure runs fine, and the export is completed successfully.
    The stored procedure is set up to be run periodically by a SQL Server Agent job.
    However, if I start the SQL Server Agent job (when logged in as the OS’s administrator), the job fails with the following error message:
    Executed as user: NT SERVICE\SQLSERVERAGENT. Cannot initialize the data source object of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)". [SQLSTATE 42000] (Error 7303) OLE DB provider "Microsoft.ACE.OLEDB.12.0"
    for linked server "(null)" returned message "Unspecified error". [SQLSTATE 01000] (Error 7412). The step failed.
    I have given the “NT SERVICE\SQLSERVERAGENT” user full control permissions on the destination directory where the export file resides (note that it starts out as an empty file just with the column headers defined).
    I’ve applied the following configuration settings to SQL Server:
    sp_configure 'show advanced options', 1;
    RECONFIGURE;
    GO
    sp_configure 'Ad Hoc Distributed Queries', 1;
    RECONFIGURE;
    GO
    EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'AllowInProcess', 1
    GO
    EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'DynamicParameters', 1
    GO
    What am I missing? Can anyone shed any light on what might be causing the failure?
    This is running on SQL Server Web Edition 2012 on Windows Server 2012 Standard.
    Thanks!

    Hi JonathanT1,
    First, if the SQL Server Agent logon account is “NT SERVICE\SQLSERVERAGENT”, we need to grant this account read and write permission on the destination folder (which you have done). From the following message:
    Executed as user: NT SERVICE\SQLSERVERAGENT
    It seems that you filled in “NT SERVICE\SQLSERVERAGENT” for the “Run as user” option in the job step properties. I suggest changing it to another account that is in SQLAgentUserRole, SQLAgentReaderRole or SQLAgentOperatorRole. For more detailed information, you can refer
    to the following links:
    Create a Transact-SQL Job Step
    http://technet.microsoft.com/en-us/library/ms187910.aspx
    SQL Server Agent Fixed Database Roles
    http://technet.microsoft.com/en-us/library/ms188283.aspx
    Allen Li
    TechNet Community Support

  • SQL Server 2012 64 bit - reading data from dbf (DBase 3/4/5 or VFoxPro) - solution and examples

    It took me quite a while to find a solution that enables me to read dbf files from 64-bit SQL Server, so I decided to create this guide for all of you searching for an answer.
    Download and install Access Database Engine 64 bit: http://www.microsoft.com/en-us/download/details.aspx?id=13255
    Creating a linked server is, for me, the most elegant solution.
    To create a linked server, modify the following SQL statement with your data and execute it:
    EXEC master.dbo.sp_addlinkedserver 
    @server = N'DBFServer', 
    @srvproduct=N'', 
    @provider=N'Microsoft.ACE.OLEDB.12.0', 
    @datasrc=N'D:\DBF\', 
    @provstr=N'dBASE IV'
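    Once the linked server exists, a pass-through query against it looks like this (ODMRAD is the same table used in the direct-read example below):
    SELECT * FROM OPENQUERY(DBFServer, 'SELECT * FROM ODMRAD')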
    You can read data directly without a linked server, but I find this less elegant.
    To read data directly, modify the following SQL statement with your data and execute it:
    Use KTCDB
    GO
    sp_configure 'show advanced options', 1
    reconfigure 
    GO
    sp_configure 'Ad Hoc Distributed Queries', 1 
    reconfigure
    GO
    EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0' , N'AllowInProcess' , 1;
    EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0' , N'DynamicParameters' , 1;
    SELECT * FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                           'dBASE IV;Database=D:\DBF\;', 
                           'SELECT * FROM ODMRAD')
    EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0' , N'AllowInProcess' , 0;
    EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0' , N'DynamicParameters' , 0;
    GO
    sp_configure 'Ad Hoc Distributed Queries', 0
    reconfigure
    GO
    sp_configure 'show advanced options', 0
    reconfigure 
    GO
    I hope this will help you guys avoid frustration and wasting lots of time.

    Visakh, I believe you are mixing up the terms 'reading' and 'importing'; with the solution I proposed you can consume 'live' data without the need to import it on every change.
    To clarify: data in DBFs can change at any point in time, and with my solution, whenever you read data either through the linked server or directly, you always get the most recent data, contrary to the import methods where you are cloning data and could actually run into many problems synchronizing the original and cloned data.
    Ok, I understand that.
    Based on your data volatility you can configure an automated job to execute an SSIS package that extracts the latest data from dBASE. But again, it won't be "in real time"; there will be a slight delay depending on the frequency you choose for the job.
    I was suggesting it more from a data-warehousing perspective, where the data refresh happens only after a predefined period (daily, every 6 hrs, etc.).
    If the requirement is to get real-time data on demand, then the linked server would definitely be the way to go.
